Mar 14 06:58:39 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 14 06:58:39 crc restorecon[4818]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:39 crc restorecon[4818]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc 
restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc 
restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 
06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc 
restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc 
restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:39 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:40
crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 
06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 
crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc 
restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:40 crc restorecon[4818]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 14 06:58:41 crc kubenswrapper[4893]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.142709 4893 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149024 4893 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149044 4893 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149049 4893 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149053 4893 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149057 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149061 4893 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149065 4893 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149070 4893 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149073 4893 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149077 4893 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149082 4893 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149086 4893 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149091 4893 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149095 4893 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149100 4893 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149104 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149109 4893 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149113 4893 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149116 4893 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149120 4893 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149124 4893 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149127 4893 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149131 4893 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149145 4893 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:41 crc 
kubenswrapper[4893]: W0314 06:58:41.149149 4893 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149154 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149159 4893 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149163 4893 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149168 4893 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149171 4893 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149175 4893 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149179 4893 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149185 4893 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149190 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149194 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149198 4893 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149202 4893 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149206 4893 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149210 4893 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149214 4893 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149219 4893 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149226 4893 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149231 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149237 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149242 4893 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149247 4893 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149252 4893 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149256 4893 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149261 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149265 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149268 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149272 4893 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149277 4893 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149282 4893 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149287 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149291 4893 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149295 4893 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149299 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149302 4893 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149306 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149310 4893 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149314 4893 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149317 4893 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149321 4893 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149325 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149329 4893 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149333 4893 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149337 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149341 4893 feature_gate.go:330] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149347 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.149351 4893 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151110 4893 flags.go:64] FLAG: --address="0.0.0.0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151127 4893 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151136 4893 flags.go:64] FLAG: --anonymous-auth="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151142 4893 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151148 4893 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151153 4893 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151159 4893 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151164 4893 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151168 4893 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151173 4893 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151178 4893 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151182 4893 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151186 4893 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 14 06:58:41 crc 
kubenswrapper[4893]: I0314 06:58:41.151190 4893 flags.go:64] FLAG: --cgroup-root="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151194 4893 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151199 4893 flags.go:64] FLAG: --client-ca-file="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151203 4893 flags.go:64] FLAG: --cloud-config="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151207 4893 flags.go:64] FLAG: --cloud-provider="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151212 4893 flags.go:64] FLAG: --cluster-dns="[]" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151217 4893 flags.go:64] FLAG: --cluster-domain="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151222 4893 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151228 4893 flags.go:64] FLAG: --config-dir="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151233 4893 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151239 4893 flags.go:64] FLAG: --container-log-max-files="5" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151246 4893 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151251 4893 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151256 4893 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151261 4893 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151269 4893 flags.go:64] FLAG: --contention-profiling="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151274 4893 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 14 06:58:41 crc kubenswrapper[4893]: 
I0314 06:58:41.151280 4893 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151284 4893 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151289 4893 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151295 4893 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151299 4893 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151304 4893 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151308 4893 flags.go:64] FLAG: --enable-load-reader="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151314 4893 flags.go:64] FLAG: --enable-server="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151319 4893 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151325 4893 flags.go:64] FLAG: --event-burst="100" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151329 4893 flags.go:64] FLAG: --event-qps="50" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151334 4893 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151339 4893 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151343 4893 flags.go:64] FLAG: --eviction-hard="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151349 4893 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151353 4893 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151357 4893 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 14 06:58:41 crc 
kubenswrapper[4893]: I0314 06:58:41.151361 4893 flags.go:64] FLAG: --eviction-soft="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151366 4893 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151370 4893 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151374 4893 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151378 4893 flags.go:64] FLAG: --experimental-mounter-path="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151383 4893 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151387 4893 flags.go:64] FLAG: --fail-swap-on="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151391 4893 flags.go:64] FLAG: --feature-gates="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151396 4893 flags.go:64] FLAG: --file-check-frequency="20s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151400 4893 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151404 4893 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151408 4893 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151413 4893 flags.go:64] FLAG: --healthz-port="10248" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151418 4893 flags.go:64] FLAG: --help="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151423 4893 flags.go:64] FLAG: --hostname-override="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151427 4893 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151432 4893 flags.go:64] FLAG: --http-check-frequency="20s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 
06:58:41.151437 4893 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151442 4893 flags.go:64] FLAG: --image-credential-provider-config="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151447 4893 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151453 4893 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151457 4893 flags.go:64] FLAG: --image-service-endpoint="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151462 4893 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151468 4893 flags.go:64] FLAG: --kube-api-burst="100" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151473 4893 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151477 4893 flags.go:64] FLAG: --kube-api-qps="50" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151483 4893 flags.go:64] FLAG: --kube-reserved="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151488 4893 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151491 4893 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151496 4893 flags.go:64] FLAG: --kubelet-cgroups="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151500 4893 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151504 4893 flags.go:64] FLAG: --lock-file="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151508 4893 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151514 4893 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 14 06:58:41 crc 
kubenswrapper[4893]: I0314 06:58:41.151534 4893 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151540 4893 flags.go:64] FLAG: --log-json-split-stream="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151545 4893 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151549 4893 flags.go:64] FLAG: --log-text-split-stream="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151553 4893 flags.go:64] FLAG: --logging-format="text" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151557 4893 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151562 4893 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151566 4893 flags.go:64] FLAG: --manifest-url="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151570 4893 flags.go:64] FLAG: --manifest-url-header="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151576 4893 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151580 4893 flags.go:64] FLAG: --max-open-files="1000000" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151585 4893 flags.go:64] FLAG: --max-pods="110" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151590 4893 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151594 4893 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151598 4893 flags.go:64] FLAG: --memory-manager-policy="None" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151602 4893 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151606 4893 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151611 4893 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151615 4893 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151626 4893 flags.go:64] FLAG: --node-status-max-images="50" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151630 4893 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151635 4893 flags.go:64] FLAG: --oom-score-adj="-999" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151639 4893 flags.go:64] FLAG: --pod-cidr="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151643 4893 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151651 4893 flags.go:64] FLAG: --pod-manifest-path="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151655 4893 flags.go:64] FLAG: --pod-max-pids="-1" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151659 4893 flags.go:64] FLAG: --pods-per-core="0" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151663 4893 flags.go:64] FLAG: --port="10250" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151668 4893 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151672 4893 flags.go:64] FLAG: --provider-id="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151702 4893 flags.go:64] FLAG: --qos-reserved="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151707 4893 flags.go:64] FLAG: --read-only-port="10255" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151712 4893 flags.go:64] FLAG: 
--register-node="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151716 4893 flags.go:64] FLAG: --register-schedulable="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151720 4893 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151728 4893 flags.go:64] FLAG: --registry-burst="10" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151732 4893 flags.go:64] FLAG: --registry-qps="5" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151736 4893 flags.go:64] FLAG: --reserved-cpus="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151740 4893 flags.go:64] FLAG: --reserved-memory="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151745 4893 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151749 4893 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151753 4893 flags.go:64] FLAG: --rotate-certificates="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151757 4893 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151762 4893 flags.go:64] FLAG: --runonce="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151766 4893 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151770 4893 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151774 4893 flags.go:64] FLAG: --seccomp-default="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151778 4893 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151783 4893 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151789 4893 flags.go:64] 
FLAG: --storage-driver-db="cadvisor" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151795 4893 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151800 4893 flags.go:64] FLAG: --storage-driver-password="root" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151806 4893 flags.go:64] FLAG: --storage-driver-secure="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151812 4893 flags.go:64] FLAG: --storage-driver-table="stats" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151816 4893 flags.go:64] FLAG: --storage-driver-user="root" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151821 4893 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151826 4893 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151830 4893 flags.go:64] FLAG: --system-cgroups="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151834 4893 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151841 4893 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151845 4893 flags.go:64] FLAG: --tls-cert-file="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151849 4893 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151854 4893 flags.go:64] FLAG: --tls-min-version="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151858 4893 flags.go:64] FLAG: --tls-private-key-file="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151871 4893 flags.go:64] FLAG: --topology-manager-policy="none" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151876 4893 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151880 
4893 flags.go:64] FLAG: --topology-manager-scope="container" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151884 4893 flags.go:64] FLAG: --v="2" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151895 4893 flags.go:64] FLAG: --version="false" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151901 4893 flags.go:64] FLAG: --vmodule="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151907 4893 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.151911 4893 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152019 4893 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152024 4893 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152029 4893 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152033 4893 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152037 4893 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152042 4893 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152047 4893 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152051 4893 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152055 4893 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152058 4893 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152062 4893 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152065 4893 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152069 4893 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152073 4893 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152077 4893 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152080 4893 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152084 4893 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152088 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152091 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152095 4893 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 
06:58:41.152098 4893 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152101 4893 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152105 4893 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152109 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152112 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152116 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152120 4893 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152123 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152127 4893 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152131 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152135 4893 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152140 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152144 4893 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152149 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152154 4893 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152158 4893 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152162 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152167 4893 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152173 4893 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152177 4893 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152182 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152186 4893 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152190 4893 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152195 4893 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152199 4893 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152203 4893 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152207 4893 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152210 4893 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152214 4893 feature_gate.go:330] unrecognized feature gate: 
NewOLM Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152219 4893 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152224 4893 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152230 4893 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152235 4893 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152239 4893 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152244 4893 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152248 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152253 4893 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152258 4893 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152263 4893 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152268 4893 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152272 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152276 4893 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152280 4893 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152284 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152289 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152293 4893 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152297 4893 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152301 4893 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152306 4893 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152309 4893 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.152313 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.152320 4893 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.165245 4893 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.165303 4893 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165426 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165449 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165458 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165469 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165478 4893 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165486 4893 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165494 4893 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165501 4893 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165509 4893 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:41 crc kubenswrapper[4893]: 
W0314 06:58:41.165517 4893 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165551 4893 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165566 4893 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165574 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165583 4893 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165592 4893 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165601 4893 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165613 4893 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165624 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165632 4893 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165641 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165649 4893 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165657 4893 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165666 4893 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165674 4893 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165681 4893 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165689 4893 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165697 4893 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165705 4893 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165712 4893 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165720 4893 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165727 4893 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165735 
4893 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165743 4893 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165751 4893 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165760 4893 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165768 4893 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165776 4893 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165784 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165792 4893 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165799 4893 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165807 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165815 4893 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165822 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165830 4893 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165838 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165845 4893 feature_gate.go:330] 
unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165854 4893 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165864 4893 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165874 4893 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165884 4893 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165894 4893 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165903 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165913 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165926 4893 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165938 4893 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165949 4893 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165959 4893 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165977 4893 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165986 4893 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.165994 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166002 4893 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166009 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166017 4893 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166025 4893 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166033 4893 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166043 4893 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166080 4893 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166088 4893 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166097 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166106 4893 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166118 4893 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.166131 4893 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166360 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166376 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166385 4893 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166394 4893 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166403 4893 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 
06:58:41.166411 4893 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166420 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166430 4893 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166439 4893 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166447 4893 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166455 4893 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166463 4893 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166472 4893 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166481 4893 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166489 4893 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166498 4893 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166508 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166545 4893 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166554 4893 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166562 4893 feature_gate.go:330] unrecognized feature gate: 
BuildCSIVolumes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166569 4893 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166577 4893 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166585 4893 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166593 4893 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166601 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166608 4893 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166616 4893 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166624 4893 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166632 4893 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166640 4893 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166647 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166655 4893 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166663 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166670 4893 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166679 4893 feature_gate.go:330] 
unrecognized feature gate: DNSNameResolver Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166687 4893 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166699 4893 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166709 4893 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166718 4893 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166728 4893 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166739 4893 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166749 4893 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166759 4893 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166770 4893 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166781 4893 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166791 4893 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166803 4893 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166813 4893 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:41 crc 
kubenswrapper[4893]: W0314 06:58:41.166824 4893 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166835 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166845 4893 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166856 4893 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166867 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166877 4893 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166887 4893 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166895 4893 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166903 4893 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166910 4893 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166918 4893 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166926 4893 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166934 4893 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166942 4893 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166952 4893 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166962 4893 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166970 4893 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166978 4893 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166986 4893 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.166993 4893 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.167001 4893 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.167009 4893 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.167020 4893 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.167033 4893 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.167298 4893 server.go:940] "Client rotation is on, will bootstrap in background" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.172271 4893 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.177491 4893 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.177746 4893 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.179837 4893 server.go:997] "Starting client certificate rotation" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.179896 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.180622 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.209311 4893 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.214189 4893 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.214253 4893 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.230554 4893 log.go:25] "Validated CRI v1 runtime API" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.268805 4893 log.go:25] "Validated CRI v1 image API" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.271572 4893 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.277119 4893 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-14-06-48-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.277164 4893 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.292491 4893 manager.go:217] Machine: {Timestamp:2026-03-14 06:58:41.289857249 +0000 UTC m=+0.552034061 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:89dd883c-4665-4ad4-a9f9-9508898f96bf BootID:b2309444-9d73-4619-8320-90d0fa1f365c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:88:3d:c9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:3d:c9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:24:dd:34 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fb:0d:ce Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:24:22:3a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7e:b0:5e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:59:8f:cc Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:da:fc:f1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:d2:37:d1:28:f7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ca:73:17:4e:28:f7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 
BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.292714 4893 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.292844 4893 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.294399 4893 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.294611 4893 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.294651 4893 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.294867 4893 topology_manager.go:138] "Creating topology manager with none policy" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.294878 4893 container_manager_linux.go:303] "Creating device plugin manager" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.295259 4893 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.295289 4893 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.295488 4893 state_mem.go:36] "Initialized new in-memory state store" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.295581 4893 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.299904 4893 kubelet.go:418] "Attempting to sync node with API server" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.299927 4893 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.299949 4893 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.299962 4893 kubelet.go:324] "Adding apiserver pod source" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.299973 4893 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 
06:58:41.304423 4893 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.305550 4893 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.306791 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.306858 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.306832 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.307067 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.309199 4893 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 14 06:58:41 crc 
kubenswrapper[4893]: I0314 06:58:41.312630 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312738 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312763 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312782 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312813 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312835 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312858 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.312890 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.313409 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.313565 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.313645 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.313672 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.314846 4893 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.316041 4893 server.go:1280] "Started kubelet" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 
06:58:41.317472 4893 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.317622 4893 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.317620 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.318309 4893 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 14 06:58:41 crc systemd[1]: Started Kubernetes Kubelet. Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.319806 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.319860 4893 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.320026 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.320085 4893 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.320092 4893 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.320174 4893 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.321560 4893 server.go:460] "Adding debug handlers to kubelet server" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.324631 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.324721 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.324838 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="200ms" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.327894 4893 factory.go:55] Registering systemd factory Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.327954 4893 factory.go:221] Registration of the systemd container factory successfully Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.327411 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca2f3cf4f37fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,LastTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329046 4893 factory.go:153] Registering CRI-O factory Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329076 4893 factory.go:221] Registration of the crio container factory successfully Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329135 4893 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329169 4893 factory.go:103] Registering Raw factory Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329184 4893 manager.go:1196] Started watching for new ooms in manager Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.329683 4893 manager.go:319] Starting recovery of all containers Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336058 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336358 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336382 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336402 4893 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336426 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336443 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336461 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336479 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336499 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336517 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336561 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336580 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336598 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.336619 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.338986 4893 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339097 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339144 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339178 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339211 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339242 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339272 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339302 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339334 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339393 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339434 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339464 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339579 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339628 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" 
seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339667 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339700 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339733 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339765 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.339799 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.341502 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 
06:58:41.341772 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.341969 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.342158 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.342348 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.342592 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.342782 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.343653 4893 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.343871 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344066 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344206 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344346 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344488 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344751 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.344969 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.345160 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.345392 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.345669 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.345892 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351406 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351497 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351589 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351619 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351709 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351734 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351759 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351783 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351803 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351825 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351846 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351865 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351886 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351909 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351931 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351951 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351972 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.351991 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352015 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352037 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352059 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352082 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352103 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352125 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352149 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352171 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352193 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352215 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352240 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352263 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352285 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352307 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352328 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352354 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352376 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352400 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352422 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352445 4893 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352466 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352489 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352510 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352561 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352586 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352609 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352635 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352659 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352682 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352705 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352728 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352750 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352772 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352795 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352820 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352853 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352879 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352903 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352926 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352948 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352972 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.352996 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353019 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353043 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353065 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353089 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353113 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353136 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353158 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353180 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 14 
06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353234 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353257 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353278 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353307 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353328 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353350 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353374 
4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353398 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353421 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353442 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353463 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353484 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353505 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353861 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353892 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353918 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353939 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353962 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.353984 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354006 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354027 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354048 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354074 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354094 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354114 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354133 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354154 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354174 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354197 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354219 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354238 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" 
seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354259 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354281 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354301 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354321 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354342 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354364 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354383 4893 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354406 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354426 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354447 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354467 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354490 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.354512 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.355794 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356487 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356544 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356568 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356591 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356613 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356642 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356665 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356717 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356743 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356767 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356790 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356813 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356839 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356861 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356883 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356903 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356925 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.356975 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357001 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357023 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357045 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357066 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357106 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: 
I0314 06:58:41.357127 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357148 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357172 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357194 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357215 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357237 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357261 4893 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357285 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357306 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357328 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357351 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357373 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357394 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357418 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357439 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357461 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357483 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357505 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357552 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357573 4893 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357598 4893 reconstruct.go:97] "Volume reconstruction finished" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.357613 4893 reconciler.go:26] "Reconciler: start to sync state" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.365465 4893 manager.go:324] Recovery completed Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.373061 4893 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.375295 4893 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.375345 4893 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.375381 4893 kubelet.go:2335] "Starting kubelet main sync loop" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.375506 4893 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 14 06:58:41 crc kubenswrapper[4893]: W0314 06:58:41.376740 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.376832 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.378042 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380001 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380047 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380060 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380748 4893 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380766 4893 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 14 06:58:41 crc kubenswrapper[4893]: I0314 06:58:41.380784 4893 state_mem.go:36] "Initialized new in-memory state store" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.420194 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.476147 4893 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.520413 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.526452 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="400ms" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.620934 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.677011 4893 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.721661 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.822018 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.923296 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
14 06:58:41 crc kubenswrapper[4893]: E0314 06:58:41.928190 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="800ms" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.024569 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.077453 4893 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.125321 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.226162 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: W0314 06:58:42.274500 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.274853 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:42 crc kubenswrapper[4893]: I0314 06:58:42.318507 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.5:6443: connect: connection refused Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.327093 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.428249 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: W0314 06:58:42.439115 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.439192 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:42 crc kubenswrapper[4893]: W0314 06:58:42.476101 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.476224 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.529448 4893 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: W0314 06:58:42.611942 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.612069 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.629933 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.730208 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="1.6s" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.730331 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.830913 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.877924 4893 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 14 06:58:42 crc kubenswrapper[4893]: E0314 06:58:42.931602 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.032711 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.133777 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.234291 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: I0314 06:58:43.318827 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.335385 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: I0314 06:58:43.390908 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.436450 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.537417 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.638587 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.739336 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.839690 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:43 crc kubenswrapper[4893]: E0314 06:58:43.940726 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.041498 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.142495 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.242923 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: W0314 06:58:44.250448 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.250488 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:44 crc kubenswrapper[4893]: I0314 06:58:44.318847 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.331504 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="3.2s" Mar 14 
06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.343734 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.444224 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.478383 4893 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.544800 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.645336 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.746257 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.847298 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.947954 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:44 crc kubenswrapper[4893]: W0314 06:58:44.973482 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:44 crc kubenswrapper[4893]: E0314 06:58:44.973586 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.048357 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: W0314 06:58:45.123824 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.123925 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.149370 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: W0314 06:58:45.194488 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.194619 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:45 crc kubenswrapper[4893]: 
E0314 06:58:45.249556 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.319147 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.350657 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.451446 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.552407 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.618879 4893 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.652891 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.675059 4893 policy_none.go:49] "None policy: Start" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.676836 4893 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.676902 4893 state_mem.go:35] "Initializing new in-memory state store" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.753933 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.837177 4893 manager.go:334] "Starting Device Plugin manager" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.837462 4893 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.837483 4893 server.go:79] "Starting device plugin registration server" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.837953 4893 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.837977 4893 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.838301 4893 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.838637 4893 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.838667 4893 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.853346 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.938405 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.940151 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.940201 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:45 
crc kubenswrapper[4893]: I0314 06:58:45.940221 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:45 crc kubenswrapper[4893]: I0314 06:58:45.940263 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:45 crc kubenswrapper[4893]: E0314 06:58:45.940765 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.141366 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.143108 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.143337 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.143502 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.143719 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:46 crc kubenswrapper[4893]: E0314 06:58:46.144597 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.319678 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 
06:58:46.545227 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.547026 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.547250 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.547429 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:46 crc kubenswrapper[4893]: I0314 06:58:46.547644 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:46 crc kubenswrapper[4893]: E0314 06:58:46.548707 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.319503 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.350103 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.352387 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.352482 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.352508 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.352618 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:47 crc kubenswrapper[4893]: E0314 06:58:47.353409 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Mar 14 06:58:47 crc kubenswrapper[4893]: E0314 06:58:47.533678 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="6.4s" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.679076 4893 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.679256 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.682592 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.682698 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.682731 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.683077 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc 
kubenswrapper[4893]: I0314 06:58:47.683487 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.683583 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685098 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685159 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685181 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685216 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685270 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685300 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685336 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685590 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.685661 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.686835 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.686890 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.686904 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687045 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687065 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687074 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687089 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687240 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.687295 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688605 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688628 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688638 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688738 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688851 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688908 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688964 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.688914 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.689034 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.689507 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.689610 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.689637 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.690027 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.690090 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.690183 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.690224 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.690238 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.691196 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.691233 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.691246 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739562 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739620 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") 
" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739653 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739684 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739706 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739788 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739811 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739889 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739958 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.739991 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.740062 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.740188 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.740356 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.740494 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.740621 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842607 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842679 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842706 4893 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842735 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842764 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842791 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842813 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842810 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842836 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842863 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842852 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842872 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.842817 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843007 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843016 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843024 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843070 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843131 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843164 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843180 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843017 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843214 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843193 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843230 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:47 crc 
kubenswrapper[4893]: I0314 06:58:47.843297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843322 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843340 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843404 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843418 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:47 crc kubenswrapper[4893]: I0314 06:58:47.843493 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.041923 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.071886 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.088851 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.113175 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.118658 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 06:58:48 crc kubenswrapper[4893]: W0314 06:58:48.157937 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d90e22ce5dc7788a7c5433086c68216d9c23cb89e4b99f1d7d7e2736efe07207 WatchSource:0}: Error finding container d90e22ce5dc7788a7c5433086c68216d9c23cb89e4b99f1d7d7e2736efe07207: Status 404 returned error can't find the container with id d90e22ce5dc7788a7c5433086c68216d9c23cb89e4b99f1d7d7e2736efe07207 Mar 14 06:58:48 crc kubenswrapper[4893]: W0314 06:58:48.173044 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-21558ec8a84efa6d74b2a3568e04221172207efb3b3d7f3cdf49bc96e46027ae WatchSource:0}: Error finding container 21558ec8a84efa6d74b2a3568e04221172207efb3b3d7f3cdf49bc96e46027ae: Status 404 returned error can't 
find the container with id 21558ec8a84efa6d74b2a3568e04221172207efb3b3d7f3cdf49bc96e46027ae Mar 14 06:58:48 crc kubenswrapper[4893]: W0314 06:58:48.173339 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6433073ba65ef301e347a38eb260cb7ed9a80dc3fa4d1f3d1990421f0329d721 WatchSource:0}: Error finding container 6433073ba65ef301e347a38eb260cb7ed9a80dc3fa4d1f3d1990421f0329d721: Status 404 returned error can't find the container with id 6433073ba65ef301e347a38eb260cb7ed9a80dc3fa4d1f3d1990421f0329d721 Mar 14 06:58:48 crc kubenswrapper[4893]: W0314 06:58:48.180093 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5488296e06108fd5206794c589f7fb664c022eb3127b7c313ac575373da9986e WatchSource:0}: Error finding container 5488296e06108fd5206794c589f7fb664c022eb3127b7c313ac575373da9986e: Status 404 returned error can't find the container with id 5488296e06108fd5206794c589f7fb664c022eb3127b7c313ac575373da9986e Mar 14 06:58:48 crc kubenswrapper[4893]: W0314 06:58:48.181776 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-33d5f651dc391765b4ea79681ad7ef21b4e76bd74238669e54d6e3f5dc94a95e WatchSource:0}: Error finding container 33d5f651dc391765b4ea79681ad7ef21b4e76bd74238669e54d6e3f5dc94a95e: Status 404 returned error can't find the container with id 33d5f651dc391765b4ea79681ad7ef21b4e76bd74238669e54d6e3f5dc94a95e Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.319365 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused 
Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.394088 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5488296e06108fd5206794c589f7fb664c022eb3127b7c313ac575373da9986e"} Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.395283 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"33d5f651dc391765b4ea79681ad7ef21b4e76bd74238669e54d6e3f5dc94a95e"} Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.396464 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6433073ba65ef301e347a38eb260cb7ed9a80dc3fa4d1f3d1990421f0329d721"} Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.397741 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21558ec8a84efa6d74b2a3568e04221172207efb3b3d7f3cdf49bc96e46027ae"} Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.398930 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d90e22ce5dc7788a7c5433086c68216d9c23cb89e4b99f1d7d7e2736efe07207"} Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.954093 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.955924 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 
06:58:48.955969 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.955984 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:48 crc kubenswrapper[4893]: I0314 06:58:48.956018 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:48 crc kubenswrapper[4893]: E0314 06:58:48.956636 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Mar 14 06:58:49 crc kubenswrapper[4893]: I0314 06:58:49.318932 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:49 crc kubenswrapper[4893]: W0314 06:58:49.383279 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:49 crc kubenswrapper[4893]: E0314 06:58:49.383383 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:49 crc kubenswrapper[4893]: I0314 06:58:49.823477 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:58:49 crc kubenswrapper[4893]: E0314 06:58:49.824724 4893 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.318543 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.406480 4893 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f95e14126ef8d4e1faa84d25aab7dcf4bad7686037ad804d5a026263069ab672" exitCode=0 Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.406624 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.406676 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f95e14126ef8d4e1faa84d25aab7dcf4bad7686037ad804d5a026263069ab672"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.408167 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.408211 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.408228 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.410162 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="eff55950969c2d8ef6ffb046b1bf7a5443798a62285d69741ace1a2d90f839ac" exitCode=0 Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.410260 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.410284 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"eff55950969c2d8ef6ffb046b1bf7a5443798a62285d69741ace1a2d90f839ac"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.413773 4893 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cc2224a6f1570ab855d2d8b18945061428e1fc7099257e1080a50b5dba3f61b8" exitCode=0 Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.413862 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cc2224a6f1570ab855d2d8b18945061428e1fc7099257e1080a50b5dba3f61b8"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.414018 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416707 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416773 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416773 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416816 4893 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416791 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.416827 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.418160 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"edb18f1797c24dd1bf0910081063b20e5bb8409605ec90dabf8c045a9d04a99e"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.418207 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.421567 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830" exitCode=0 Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.421636 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830"} Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.421758 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.423375 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:50 crc 
kubenswrapper[4893]: I0314 06:58:50.423414 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.423426 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.427544 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.428283 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.428319 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:50 crc kubenswrapper[4893]: I0314 06:58:50.428330 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:50 crc kubenswrapper[4893]: W0314 06:58:50.648407 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:50 crc kubenswrapper[4893]: E0314 06:58:50.648557 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:50 crc kubenswrapper[4893]: E0314 06:58:50.747542 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.5:6443: 
connect: connection refused" event="&Event{ObjectMeta:{crc.189ca2f3cf4f37fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,LastTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:58:50 crc kubenswrapper[4893]: W0314 06:58:50.994067 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:50 crc kubenswrapper[4893]: E0314 06:58:50.994212 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:51 crc kubenswrapper[4893]: W0314 06:58:51.054068 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:51 crc kubenswrapper[4893]: E0314 06:58:51.054216 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.319518 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.428503 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7a04d968dbfe88e8feabb1bc0ca79f526bf075b474755ae6e3275142a1a07129"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.428590 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a89b4eebc764edaee5c8c8ba5b354d379648cc84fe636e9051f3ebbccb0dcd3a"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.428603 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b65ae6f761eee45fad3d00f4c1aec2c7d7cabfd491cdd1f01ff4f2d45d8a40f1"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.428746 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.429790 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.429819 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.429827 4893 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.432640 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9a569e186833030a175e8707d76e522a912af9fc7943ad92b88bf05c518f9a17"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.432675 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e6020db59dc4dfed39cbc5491613e374311200a9b90727b9bf78554a53312a38"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.432731 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.433594 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.433615 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.433624 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.437854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.437882 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.437894 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.437905 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.440256 4893 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42efd85c7afaac4c43b9abfc44f5beaee818177931d19d862a16e7a73baea7f4" exitCode=0 Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.440302 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42efd85c7afaac4c43b9abfc44f5beaee818177931d19d862a16e7a73baea7f4"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.440390 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.441155 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.441178 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.441186 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.444403 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a3d02d6cccd912e953a09272248822d6600be18e5138914d0a4e6af6b4eaf1a6"} Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.444458 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.445015 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.445030 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:51 crc kubenswrapper[4893]: I0314 06:58:51.445038 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.157383 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.158540 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.158573 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.158584 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.158603 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.448804 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="12e9eb77cfafad5ca5ca1ca71de32acd0bd172b4ff6bba5d43d51b1eb43c3a1a" exitCode=0 Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.448866 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"12e9eb77cfafad5ca5ca1ca71de32acd0bd172b4ff6bba5d43d51b1eb43c3a1a"} Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.448969 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.449744 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.449832 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.449921 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.451564 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e006554b086d62449d31d61b3917cc5972ec7b796834b1696d4129d8dd4cb64f"} Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.451606 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.451677 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.451724 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.451802 4893 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.452687 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.452941 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453003 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453017 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453608 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453658 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453674 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453788 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453854 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453988 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.453931 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.454353 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:52 crc kubenswrapper[4893]: I0314 06:58:52.454372 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.042067 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.042142 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.237067 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459405 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459649 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"492e2e17e5f7c53b824566055b07fef61f9fb8bac73a68ca7c75594d234132c9"} Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459715 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8974c67f4129571db545fe36e7f7f95724131ca2c106f911fb6818194973164b"} Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459730 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"42891f5e3a86654fc47e53765ed9c9e2ce9d0e39ed604bf43ce1c76b529ed497"} Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459739 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459805 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459744 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"95004116439f8ed03a26654e0d7eaf325a1c1810c8799d6a9ab77642607e98c2"} Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.459885 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.460855 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.460900 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.460917 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.461824 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.461839 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.461872 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 
06:58:53.461893 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.461878 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.462068 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.973271 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:53 crc kubenswrapper[4893]: I0314 06:58:53.982603 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466063 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b8ef7634592d5d444110f1a0effaa6ed17ea63d7159bcb2ccc5137d7ab2d5bdb"} Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466126 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466181 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466204 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466258 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466891 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:54 
crc kubenswrapper[4893]: I0314 06:58:54.466920 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.466931 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.467408 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.467429 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.467438 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.468087 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.468149 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:54 crc kubenswrapper[4893]: I0314 06:58:54.468413 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.005160 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.468332 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.468976 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.469064 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.468379 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.469961 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.469988 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470002 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470266 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470349 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470413 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470852 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470873 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:55 crc kubenswrapper[4893]: I0314 06:58:55.470882 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:55 crc kubenswrapper[4893]: E0314 06:58:55.855414 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.033767 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.238009 4893 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.238090 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.470611 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.470662 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.471677 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.471720 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.471733 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.762511 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.762735 
4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.763937 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.764000 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.764017 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:56 crc kubenswrapper[4893]: I0314 06:58:56.769112 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:57 crc kubenswrapper[4893]: I0314 06:58:57.472849 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:57 crc kubenswrapper[4893]: I0314 06:58:57.473793 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:57 crc kubenswrapper[4893]: I0314 06:58:57.473853 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:57 crc kubenswrapper[4893]: I0314 06:58:57.473870 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.119612 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.120192 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.121965 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.122005 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.122020 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[4893]: I0314 06:58:58.422236 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.160303 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.303376 4893 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:02 crc kubenswrapper[4893]: W0314 06:59:02.305111 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.305416 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:02 crc kubenswrapper[4893]: W0314 06:59:02.306063 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.306157 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:02 crc kubenswrapper[4893]: W0314 06:59:02.307911 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.308157 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.308066 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: W0314 06:59:02.309758 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.309901 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.311185 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca2f3cf4f37fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,LastTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.314429 4893 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.314572 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 06:59:02 crc kubenswrapper[4893]: E0314 06:59:02.317438 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.318945 4893 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.319023 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.329807 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:02Z is after 2026-02-23T05:33:13Z Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.487745 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.490420 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e006554b086d62449d31d61b3917cc5972ec7b796834b1696d4129d8dd4cb64f" exitCode=255 Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.490513 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e006554b086d62449d31d61b3917cc5972ec7b796834b1696d4129d8dd4cb64f"} Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.490708 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.491488 4893 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.491532 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.491546 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:02 crc kubenswrapper[4893]: I0314 06:59:02.492101 4893 scope.go:117] "RemoveContainer" containerID="e006554b086d62449d31d61b3917cc5972ec7b796834b1696d4129d8dd4cb64f" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.049483 4893 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]log ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]etcd ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/generic-apiserver-start-informers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/priority-and-fairness-filter ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 14 06:59:03 crc kubenswrapper[4893]: 
[+]poststarthook/start-apiextensions-informers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-apiextensions-controllers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/crd-informer-synced ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-system-namespaces-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 14 06:59:03 crc kubenswrapper[4893]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/bootstrap-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/start-kube-aggregator-informers ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-registration-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-discovery-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: 
[+]poststarthook/kube-apiserver-autoregistration ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]autoregister-completion ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-openapi-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 14 06:59:03 crc kubenswrapper[4893]: livez check failed Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.049562 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.321820 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:03Z is after 2026-02-23T05:33:13Z Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.402987 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.494080 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.495486 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091"} Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.495626 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 
06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.497068 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.497092 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:03 crc kubenswrapper[4893]: I0314 06:59:03.497101 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.322875 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:04Z is after 2026-02-23T05:33:13Z Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.500914 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.501634 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.503969 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" exitCode=255 Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.504040 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091"} Mar 14 06:59:04 crc kubenswrapper[4893]: 
I0314 06:59:04.504117 4893 scope.go:117] "RemoveContainer" containerID="e006554b086d62449d31d61b3917cc5972ec7b796834b1696d4129d8dd4cb64f" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.504180 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.505683 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.505735 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.505752 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:04 crc kubenswrapper[4893]: I0314 06:59:04.506555 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:04 crc kubenswrapper[4893]: E0314 06:59:04.506877 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.005477 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.321303 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T06:59:05Z is after 2026-02-23T05:33:13Z Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.508841 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.511062 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.511828 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.511857 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.511866 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:05 crc kubenswrapper[4893]: I0314 06:59:05.512322 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:05 crc kubenswrapper[4893]: E0314 06:59:05.512510 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:05 crc kubenswrapper[4893]: E0314 06:59:05.856118 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.238042 4893 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.238117 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.324564 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:06Z is after 2026-02-23T05:33:13Z Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.513937 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.515742 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.515798 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.515816 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.516678 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:06 crc kubenswrapper[4893]: 
E0314 06:59:06.517121 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.777676 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.777898 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.779663 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.779705 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.779723 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.794507 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.794691 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.795944 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.795976 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 
06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.795986 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:06 crc kubenswrapper[4893]: I0314 06:59:06.808703 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 14 06:59:07 crc kubenswrapper[4893]: I0314 06:59:07.321365 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:07Z is after 2026-02-23T05:33:13Z Mar 14 06:59:07 crc kubenswrapper[4893]: I0314 06:59:07.516797 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:07 crc kubenswrapper[4893]: I0314 06:59:07.518209 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:07 crc kubenswrapper[4893]: I0314 06:59:07.518254 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:07 crc kubenswrapper[4893]: I0314 06:59:07.518268 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.047800 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.047947 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.049083 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.049141 
4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.049150 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.049748 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:08 crc kubenswrapper[4893]: E0314 06:59:08.049916 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.052764 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.322693 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:08Z is after 2026-02-23T05:33:13Z Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.519094 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.520612 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.520662 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.520674 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.521325 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:08 crc kubenswrapper[4893]: E0314 06:59:08.521540 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.560768 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.562420 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.562461 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.562474 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:08 crc kubenswrapper[4893]: I0314 06:59:08.562503 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:08 crc kubenswrapper[4893]: E0314 06:59:08.566349 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:08Z is after 
2026-02-23T05:33:13Z" node="crc" Mar 14 06:59:09 crc kubenswrapper[4893]: E0314 06:59:09.323049 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:09Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:09 crc kubenswrapper[4893]: I0314 06:59:09.327004 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:09Z is after 2026-02-23T05:33:13Z Mar 14 06:59:10 crc kubenswrapper[4893]: I0314 06:59:10.322281 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:10Z is after 2026-02-23T05:33:13Z Mar 14 06:59:11 crc kubenswrapper[4893]: I0314 06:59:11.322400 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z Mar 14 06:59:12 crc kubenswrapper[4893]: E0314 06:59:12.315137 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:12Z is after 
2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca2f3cf4f37fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,LastTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:12 crc kubenswrapper[4893]: I0314 06:59:12.320929 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:12Z is after 2026-02-23T05:33:13Z Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.323373 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:13Z is after 2026-02-23T05:33:13Z Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.403131 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.403615 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.404940 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.405038 
4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.405112 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:13 crc kubenswrapper[4893]: I0314 06:59:13.405677 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:13 crc kubenswrapper[4893]: E0314 06:59:13.405903 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:14 crc kubenswrapper[4893]: I0314 06:59:14.322405 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:14Z is after 2026-02-23T05:33:13Z Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.320553 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:15Z is after 2026-02-23T05:33:13Z Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.567038 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.568372 4893 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.568478 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.568598 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:15 crc kubenswrapper[4893]: I0314 06:59:15.568699 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:15 crc kubenswrapper[4893]: E0314 06:59:15.571606 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:15Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 06:59:15 crc kubenswrapper[4893]: E0314 06:59:15.856239 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.238502 4893 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.239090 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:16 crc 
kubenswrapper[4893]: I0314 06:59:16.239231 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.239460 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.240732 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.240779 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.240790 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.241350 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"edb18f1797c24dd1bf0910081063b20e5bb8409605ec90dabf8c045a9d04a99e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.241542 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://edb18f1797c24dd1bf0910081063b20e5bb8409605ec90dabf8c045a9d04a99e" gracePeriod=30 Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.321723 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z Mar 14 06:59:16 crc kubenswrapper[4893]: E0314 06:59:16.326241 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.539676 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540005 4893 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="edb18f1797c24dd1bf0910081063b20e5bb8409605ec90dabf8c045a9d04a99e" exitCode=255 Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540035 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"edb18f1797c24dd1bf0910081063b20e5bb8409605ec90dabf8c045a9d04a99e"} Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540077 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"69000e90207208ff01c09bcb46a97174a8ef8a682db8d50e3d3ae283267500ec"} Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540154 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540851 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540869 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:16 crc kubenswrapper[4893]: I0314 06:59:16.540877 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:17 crc kubenswrapper[4893]: I0314 06:59:17.320720 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:17Z is after 2026-02-23T05:33:13Z Mar 14 06:59:18 crc kubenswrapper[4893]: I0314 06:59:18.321427 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:18Z is after 2026-02-23T05:33:13Z Mar 14 06:59:18 crc kubenswrapper[4893]: I0314 06:59:18.418477 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:18 crc kubenswrapper[4893]: E0314 06:59:18.421875 4893 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:18 crc kubenswrapper[4893]: E0314 06:59:18.423127 4893 certificate_manager.go:440] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 14 06:59:19 crc kubenswrapper[4893]: I0314 06:59:19.320403 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:19Z is after 2026-02-23T05:33:13Z Mar 14 06:59:20 crc kubenswrapper[4893]: I0314 06:59:20.321027 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z Mar 14 06:59:20 crc kubenswrapper[4893]: W0314 06:59:20.450822 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z Mar 14 06:59:20 crc kubenswrapper[4893]: E0314 06:59:20.450918 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:20 crc kubenswrapper[4893]: W0314 06:59:20.615183 4893 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z Mar 14 06:59:20 crc kubenswrapper[4893]: E0314 06:59:20.615453 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:21 crc kubenswrapper[4893]: I0314 06:59:21.323094 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:21Z is after 2026-02-23T05:33:13Z Mar 14 06:59:21 crc kubenswrapper[4893]: W0314 06:59:21.742425 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:21Z is after 2026-02-23T05:33:13Z Mar 14 06:59:21 crc kubenswrapper[4893]: E0314 06:59:21.742774 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.320835 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3cf4f37fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,LastTimestamp:2026-03-14 06:58:41.315960828 +0000 UTC m=+0.578137690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 06:59:22.320966 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.323269 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC 
m=+0.642209260,LastTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.327338 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.328633 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.335594 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f4dd22bfbf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:45.842894783 +0000 UTC m=+5.105071605,LastTimestamp:2026-03-14 06:58:45.842894783 +0000 UTC m=+5.105071605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.343376 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:45.940180152 +0000 UTC m=+5.202356964,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.349198 4893 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:45.940211803 +0000 UTC m=+5.202388615,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.354207 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:45.940231343 +0000 UTC m=+5.202408155,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.359835 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:46.143267623 +0000 UTC m=+5.405444485,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.366686 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:46.143483069 +0000 UTC m=+5.405659941,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.371790 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:46.143672913 +0000 UTC m=+5.405849765,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.377090 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:46.547212116 +0000 UTC m=+5.809388938,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.382396 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:46.54740201 +0000 UTC m=+5.809578842,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.389908 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:46.547578174 +0000 UTC m=+5.809755026,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.395109 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC 
m=+0.642209260,LastTimestamp:2026-03-14 06:58:47.352444861 +0000 UTC m=+6.614621703,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.401728 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:47.352499952 +0000 UTC m=+6.614676784,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.409380 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:47.352583414 +0000 UTC m=+6.614760246,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.414747 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:47.682666207 +0000 UTC m=+6.944843029,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.419972 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:47.682716798 +0000 UTC m=+6.944893640,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.424703 4893 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:47.682746058 +0000 UTC m=+6.944922910,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.430742 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:47.685145073 +0000 UTC m=+6.947321905,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.435873 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:47.685174403 +0000 UTC m=+6.947351225,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.440286 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d32165ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d32165ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380066799 +0000 UTC m=+0.642243601,LastTimestamp:2026-03-14 06:58:47.685192384 +0000 UTC m=+6.947369216,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.446790 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d320dfca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d320dfca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380032458 +0000 UTC m=+0.642209260,LastTimestamp:2026-03-14 06:58:47.685251125 +0000 UTC m=+6.947428007,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.452296 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f3d3213a61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f3d3213a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:41.380055649 +0000 UTC m=+0.642232451,LastTimestamp:2026-03-14 06:58:47.685288376 +0000 UTC m=+6.947465208,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.460977 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f568027e36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:48.172813878 +0000 UTC m=+7.434990660,LastTimestamp:2026-03-14 06:58:48.172813878 +0000 UTC m=+7.434990660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.466816 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f568329671 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:48.175965809 +0000 UTC m=+7.438142591,LastTimestamp:2026-03-14 06:58:48.175965809 +0000 UTC m=+7.438142591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.471380 4893 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f568405136 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:48.17686559 +0000 UTC m=+7.439042382,LastTimestamp:2026-03-14 06:58:48.17686559 +0000 UTC m=+7.439042382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.478893 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f568b9f071 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:48.184836209 +0000 UTC m=+7.447013001,LastTimestamp:2026-03-14 06:58:48.184836209 +0000 UTC m=+7.447013001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.483731 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f568bc53b5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:48.184992693 +0000 UTC m=+7.447169495,LastTimestamp:2026-03-14 06:58:48.184992693 +0000 UTC m=+7.447169495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.490766 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f5c4893466 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.725146214 +0000 UTC m=+8.987323006,LastTimestamp:2026-03-14 06:58:49.725146214 +0000 UTC m=+8.987323006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.497053 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f5c4899672 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.725171314 +0000 UTC m=+8.987348146,LastTimestamp:2026-03-14 06:58:49.725171314 +0000 UTC m=+8.987348146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.503612 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5c489ea3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.725192765 +0000 UTC m=+8.987369557,LastTimestamp:2026-03-14 06:58:49.725192765 +0000 UTC m=+8.987369557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.508688 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5c48b4c8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.725283467 +0000 UTC m=+8.987460259,LastTimestamp:2026-03-14 06:58:49.725283467 +0000 UTC m=+8.987460259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.512887 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5c49653d8 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.726006232 +0000 UTC m=+8.988183064,LastTimestamp:2026-03-14 06:58:49.726006232 +0000 UTC m=+8.988183064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.519024 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5c6cbb769 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.763059561 +0000 UTC m=+9.025236353,LastTimestamp:2026-03-14 06:58:49.763059561 +0000 UTC m=+9.025236353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.523213 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f5cac99ed9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:49.830031065 +0000 UTC m=+9.092207877,LastTimestamp:2026-03-14 06:58:49.830031065 +0000 UTC m=+9.092207877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.527289 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5d74f224a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.040107594 +0000 UTC m=+9.302284406,LastTimestamp:2026-03-14 06:58:50.040107594 +0000 UTC m=+9.302284406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.531798 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f5d792ea0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.044549645 +0000 UTC m=+9.306726467,LastTimestamp:2026-03-14 06:58:50.044549645 +0000 UTC m=+9.306726467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.538630 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5d7936fe1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.044583905 +0000 UTC m=+9.306760707,LastTimestamp:2026-03-14 06:58:50.044583905 +0000 UTC m=+9.306760707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.542986 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5d7be1993 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.047379859 +0000 UTC m=+9.309556651,LastTimestamp:2026-03-14 06:58:50.047379859 +0000 UTC m=+9.309556651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.550029 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5e8202e79 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.322243193 +0000 UTC m=+9.584419985,LastTimestamp:2026-03-14 06:58:50.322243193 +0000 UTC 
m=+9.584419985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.558578 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5e8d07ce4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.333797604 +0000 UTC m=+9.595974406,LastTimestamp:2026-03-14 06:58:50.333797604 +0000 UTC m=+9.595974406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.565057 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5e8e82a50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.335349328 +0000 UTC m=+9.597526130,LastTimestamp:2026-03-14 06:58:50.335349328 +0000 UTC m=+9.597526130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.571362 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f5edb52012 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.41589045 +0000 UTC m=+9.678067242,LastTimestamp:2026-03-14 06:58:50.41589045 +0000 UTC m=+9.678067242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 06:59:22.572482 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 06:59:22.574286 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 
06:59:22.574343 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 06:59:22.574368 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:22 crc kubenswrapper[4893]: I0314 06:59:22.574410 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.579935 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f5ede91152 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.419294546 +0000 UTC m=+9.681471338,LastTimestamp:2026-03-14 06:58:50.419294546 +0000 UTC m=+9.681471338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.584611 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.584757 4893 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5edec17ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.419492842 +0000 UTC m=+9.681669644,LastTimestamp:2026-03-14 06:58:50.419492842 +0000 UTC m=+9.681669644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.591307 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5ee63fc17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.427350039 +0000 UTC 
m=+9.689526831,LastTimestamp:2026-03-14 06:58:50.427350039 +0000 UTC m=+9.689526831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.598986 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5f4db1fec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.535821292 +0000 UTC m=+9.797998084,LastTimestamp:2026-03-14 06:58:50.535821292 +0000 UTC m=+9.797998084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.604061 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5f62d39fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.557979132 +0000 UTC m=+9.820155924,LastTimestamp:2026-03-14 06:58:50.557979132 +0000 UTC m=+9.820155924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.610352 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5f64d4cfe openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.56008115 +0000 UTC m=+9.822257942,LastTimestamp:2026-03-14 06:58:50.56008115 +0000 UTC m=+9.822257942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.617390 4893 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5fb60d6f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.645247735 +0000 UTC m=+9.907424527,LastTimestamp:2026-03-14 06:58:50.645247735 +0000 UTC m=+9.907424527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.622277 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5fb8df562 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.648204642 +0000 UTC m=+9.910381444,LastTimestamp:2026-03-14 06:58:50.648204642 +0000 UTC m=+9.910381444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: 
E0314 06:59:22.628661 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f5fb95a105 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.648707333 +0000 UTC m=+9.910884125,LastTimestamp:2026-03-14 06:58:50.648707333 +0000 UTC m=+9.910884125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.638894 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f5fb97db08 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.648853256 +0000 UTC m=+9.911030048,LastTimestamp:2026-03-14 06:58:50.648853256 +0000 UTC m=+9.911030048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.644331 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5fc6e3688 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.662901384 +0000 UTC m=+9.925078166,LastTimestamp:2026-03-14 06:58:50.662901384 +0000 UTC m=+9.925078166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.648718 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f5fc779d9e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.663517598 +0000 UTC m=+9.925708741,LastTimestamp:2026-03-14 06:58:50.663517598 
+0000 UTC m=+9.925708741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.653552 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f5fc859200 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.664432128 +0000 UTC m=+9.926608920,LastTimestamp:2026-03-14 06:58:50.664432128 +0000 UTC m=+9.926608920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.658725 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5fc953b8c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.665458572 +0000 UTC m=+9.927635364,LastTimestamp:2026-03-14 06:58:50.665458572 +0000 UTC m=+9.927635364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.663373 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f5fcbc1a2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.66800593 +0000 UTC m=+9.930182722,LastTimestamp:2026-03-14 06:58:50.66800593 +0000 UTC m=+9.930182722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.667961 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f5fe5bb89f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.695243935 +0000 UTC m=+9.957420727,LastTimestamp:2026-03-14 06:58:50.695243935 +0000 UTC m=+9.957420727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.673150 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f60395b232 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.782929458 +0000 UTC m=+10.045106250,LastTimestamp:2026-03-14 06:58:50.782929458 +0000 UTC m=+10.045106250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.679302 4893 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f604ac86d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.801202901 +0000 UTC m=+10.063379693,LastTimestamp:2026-03-14 06:58:50.801202901 +0000 UTC m=+10.063379693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.684588 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f608070a69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.857466473 +0000 UTC m=+10.119643265,LastTimestamp:2026-03-14 06:58:50.857466473 +0000 UTC m=+10.119643265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.689708 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f608e92f5c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.872287068 +0000 UTC m=+10.134463860,LastTimestamp:2026-03-14 06:58:50.872287068 +0000 UTC m=+10.134463860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.696603 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f60948a8f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.878544119 +0000 UTC m=+10.140720901,LastTimestamp:2026-03-14 
06:58:50.878544119 +0000 UTC m=+10.140720901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.702005 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f6095e76bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.879973052 +0000 UTC m=+10.142149844,LastTimestamp:2026-03-14 06:58:50.879973052 +0000 UTC m=+10.142149844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.707292 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f60a39068c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.894296716 +0000 UTC m=+10.156473508,LastTimestamp:2026-03-14 06:58:50.894296716 +0000 UTC m=+10.156473508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.711817 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f60a4e2473 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.895680627 +0000 UTC m=+10.157857419,LastTimestamp:2026-03-14 06:58:50.895680627 +0000 UTC m=+10.157857419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.716238 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f617fda9a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.125287329 +0000 UTC m=+10.387464121,LastTimestamp:2026-03-14 06:58:51.125287329 +0000 UTC m=+10.387464121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.721175 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f61818e0ef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.127070959 +0000 UTC m=+10.389247751,LastTimestamp:2026-03-14 06:58:51.127070959 +0000 UTC m=+10.389247751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.725957 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f618e50630 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.14044984 +0000 UTC m=+10.402626622,LastTimestamp:2026-03-14 06:58:51.14044984 +0000 UTC m=+10.402626622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.730003 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f618ff68c2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.14217901 +0000 UTC m=+10.404355802,LastTimestamp:2026-03-14 06:58:51.14217901 +0000 UTC 
m=+10.404355802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.734480 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f6192e6848 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.14525908 +0000 UTC m=+10.407435872,LastTimestamp:2026-03-14 06:58:51.14525908 +0000 UTC m=+10.407435872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.738910 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f623608b1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.316316957 +0000 UTC m=+10.578493749,LastTimestamp:2026-03-14 06:58:51.316316957 +0000 UTC m=+10.578493749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.743405 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f6245c8222 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.33282973 +0000 UTC m=+10.595006522,LastTimestamp:2026-03-14 06:58:51.33282973 +0000 UTC m=+10.595006522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.747200 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f624719902 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.334211842 +0000 UTC m=+10.596388634,LastTimestamp:2026-03-14 06:58:51.334211842 +0000 UTC m=+10.596388634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.754844 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f62ae40077 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.442372727 +0000 UTC m=+10.704549509,LastTimestamp:2026-03-14 06:58:51.442372727 +0000 UTC m=+10.704549509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.762868 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f6303ed4b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.532211378 +0000 UTC m=+10.794388160,LastTimestamp:2026-03-14 06:58:51.532211378 +0000 UTC m=+10.794388160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.767503 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f63139b654 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.54865314 +0000 UTC m=+10.810829932,LastTimestamp:2026-03-14 
06:58:51.54865314 +0000 UTC m=+10.810829932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.771816 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f636944182 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.63847309 +0000 UTC m=+10.900649882,LastTimestamp:2026-03-14 06:58:51.63847309 +0000 UTC m=+10.900649882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.775920 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f6374e283c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.650656316 +0000 UTC m=+10.912833108,LastTimestamp:2026-03-14 06:58:51.650656316 +0000 UTC 
m=+10.912833108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.780573 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f667089354 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.45140258 +0000 UTC m=+11.713579382,LastTimestamp:2026-03-14 06:58:52.45140258 +0000 UTC m=+11.713579382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.786993 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f671218f62 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.62081213 +0000 UTC 
m=+11.882988932,LastTimestamp:2026-03-14 06:58:52.62081213 +0000 UTC m=+11.882988932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.791057 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f6719add0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.628761869 +0000 UTC m=+11.890938661,LastTimestamp:2026-03-14 06:58:52.628761869 +0000 UTC m=+11.890938661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.794639 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f671adf9f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.630014457 +0000 UTC m=+11.892191259,LastTimestamp:2026-03-14 06:58:52.630014457 +0000 UTC m=+11.892191259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.799460 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f67caf9858 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.814669912 +0000 UTC m=+12.076846704,LastTimestamp:2026-03-14 06:58:52.814669912 +0000 UTC m=+12.076846704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.803845 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f67da2a32d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.830597933 +0000 
UTC m=+12.092774735,LastTimestamp:2026-03-14 06:58:52.830597933 +0000 UTC m=+12.092774735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.808581 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f67dbfbea1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:52.832505505 +0000 UTC m=+12.094682307,LastTimestamp:2026-03-14 06:58:52.832505505 +0000 UTC m=+12.094682307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.812167 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f6893fb8c4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.025441988 +0000 UTC m=+12.287618770,LastTimestamp:2026-03-14 06:58:53.025441988 +0000 UTC m=+12.287618770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.816491 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f68a0f88e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.039061225 +0000 UTC m=+12.301238017,LastTimestamp:2026-03-14 06:58:53.039061225 +0000 UTC m=+12.301238017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.820822 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f68a292744 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.040740164 +0000 UTC m=+12.302916956,LastTimestamp:2026-03-14 06:58:53.040740164 +0000 UTC m=+12.302916956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.827746 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f696177547 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.240907079 +0000 UTC m=+12.503083871,LastTimestamp:2026-03-14 06:58:53.240907079 +0000 UTC m=+12.503083871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.832646 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f696bc9cc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.251730633 +0000 UTC m=+12.513907425,LastTimestamp:2026-03-14 06:58:53.251730633 +0000 UTC m=+12.513907425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.836692 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f696d13f36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.253082934 +0000 UTC m=+12.515259716,LastTimestamp:2026-03-14 06:58:53.253082934 +0000 UTC m=+12.515259716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.840205 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f6a31040ad 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.458538669 +0000 UTC m=+12.720715461,LastTimestamp:2026-03-14 06:58:53.458538669 +0000 UTC m=+12.720715461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.847100 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f6a3c5f58d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:53.470446989 +0000 UTC m=+12.732623781,LastTimestamp:2026-03-14 06:58:53.470446989 +0000 UTC m=+12.732623781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.850212 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:22 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca2f748bc75e9 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:22 crc kubenswrapper[4893]: body: Mar 14 06:59:22 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:56.238065129 +0000 UTC m=+15.500241931,LastTimestamp:2026-03-14 06:58:56.238065129 +0000 UTC m=+15.500241931,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:22 crc kubenswrapper[4893]: > Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.854666 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f748bd4aed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:56.238119661 +0000 UTC m=+15.500296453,LastTimestamp:2026-03-14 06:58:56.238119661 +0000 UTC 
m=+15.500296453,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.860186 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 06:59:22 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-apiserver-crc.189ca2f8b2ebb336 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 06:59:22 crc kubenswrapper[4893]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 06:59:22 crc kubenswrapper[4893]: Mar 14 06:59:22 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.314513206 +0000 UTC m=+21.576689998,LastTimestamp:2026-03-14 06:59:02.314513206 +0000 UTC m=+21.576689998,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:22 crc kubenswrapper[4893]: > Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.864862 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f8b2ed351e openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.314611998 +0000 UTC m=+21.576788790,LastTimestamp:2026-03-14 06:59:02.314611998 +0000 UTC m=+21.576788790,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.869115 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2f8b2ebb336\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 06:59:22 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-apiserver-crc.189ca2f8b2ebb336 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 06:59:22 crc kubenswrapper[4893]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 06:59:22 crc kubenswrapper[4893]: Mar 14 06:59:22 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.314513206 +0000 UTC m=+21.576689998,LastTimestamp:2026-03-14 06:59:02.319001034 +0000 UTC 
m=+21.581177826,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:22 crc kubenswrapper[4893]: > Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.874630 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2f8b2ed351e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f8b2ed351e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.314611998 +0000 UTC m=+21.576788790,LastTimestamp:2026-03-14 06:59:02.319056636 +0000 UTC m=+21.581233428,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.879788 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2f624719902\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f624719902 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.334211842 +0000 UTC m=+10.596388634,LastTimestamp:2026-03-14 06:59:02.493178157 +0000 UTC m=+21.755354949,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.884551 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2f6303ed4b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f6303ed4b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.532211378 +0000 UTC m=+10.794388160,LastTimestamp:2026-03-14 06:59:02.678572221 +0000 UTC m=+21.940749013,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.890691 4893 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189ca2f63139b654\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f63139b654 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:51.54865314 +0000 UTC m=+10.810829932,LastTimestamp:2026-03-14 06:59:02.688038241 +0000 UTC m=+21.950215033,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.895479 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:22 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc8cee4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:22 crc kubenswrapper[4893]: body: Mar 14 06:59:22 crc kubenswrapper[4893]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238095076 +0000 UTC m=+25.500271868,LastTimestamp:2026-03-14 06:59:06.238095076 +0000 UTC m=+25.500271868,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:22 crc kubenswrapper[4893]: > Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.899261 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc97d63 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238139747 +0000 UTC m=+25.500316539,LastTimestamp:2026-03-14 06:59:06.238139747 +0000 UTC m=+25.500316539,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.908115 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f99cc8cee4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:22 crc kubenswrapper[4893]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc8cee4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:22 crc kubenswrapper[4893]: body: Mar 14 06:59:22 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238095076 +0000 UTC m=+25.500271868,LastTimestamp:2026-03-14 06:59:16.23906444 +0000 UTC m=+35.501241262,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:22 crc kubenswrapper[4893]: > Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.915958 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f99cc97d63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc97d63 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238139747 +0000 UTC m=+25.500316539,LastTimestamp:2026-03-14 06:59:16.239195403 +0000 UTC m=+35.501372195,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.920086 4893 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2fbf108c2a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:16.24150698 +0000 UTC m=+35.503683772,LastTimestamp:2026-03-14 06:59:16.24150698 +0000 UTC m=+35.503683772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.924433 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f5d7be1993\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5d7be1993 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.047379859 +0000 UTC m=+9.309556651,LastTimestamp:2026-03-14 06:59:16.355246107 +0000 UTC m=+35.617422889,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.928391 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f5e8202e79\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5e8202e79 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.322243193 +0000 UTC m=+9.584419985,LastTimestamp:2026-03-14 06:59:16.523761392 +0000 UTC m=+35.785938184,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:22 crc kubenswrapper[4893]: E0314 06:59:22.932324 4893 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ca2f5e8d07ce4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f5e8d07ce4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:50.333797604 +0000 UTC m=+9.595974406,LastTimestamp:2026-03-14 06:59:16.532205897 +0000 UTC m=+35.794382689,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.237844 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.238212 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.240435 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.240712 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.240739 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:23 crc kubenswrapper[4893]: I0314 06:59:23.325034 4893 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:23 crc kubenswrapper[4893]: E0314 06:59:23.332421 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:24 crc kubenswrapper[4893]: I0314 06:59:24.324083 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:25 crc kubenswrapper[4893]: W0314 06:59:25.040782 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:25 crc kubenswrapper[4893]: E0314 06:59:25.040839 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:25 crc kubenswrapper[4893]: I0314 06:59:25.324704 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:25 crc kubenswrapper[4893]: E0314 06:59:25.856944 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.034568 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.034747 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.036140 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.036218 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.036232 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.238055 4893 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.238176 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:26 crc kubenswrapper[4893]: E0314 06:59:26.246338 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f99cc8cee4\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:26 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc8cee4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:26 crc kubenswrapper[4893]: body: Mar 14 06:59:26 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238095076 +0000 UTC m=+25.500271868,LastTimestamp:2026-03-14 06:59:26.238141447 +0000 UTC m=+45.500318269,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:26 crc kubenswrapper[4893]: > Mar 14 06:59:26 crc kubenswrapper[4893]: E0314 06:59:26.252941 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f99cc97d63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc97d63 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238139747 +0000 UTC m=+25.500316539,LastTimestamp:2026-03-14 06:59:26.238227519 +0000 UTC m=+45.500404341,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:26 crc kubenswrapper[4893]: I0314 06:59:26.323235 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:27 crc kubenswrapper[4893]: I0314 06:59:27.322168 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.324456 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.375937 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.377287 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.377333 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.377345 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:28 crc kubenswrapper[4893]: I0314 06:59:28.377974 4893 scope.go:117] "RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.327323 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.578400 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.579906 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.583288 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" exitCode=255 Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.583384 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9"} Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.583484 4893 scope.go:117] 
"RemoveContainer" containerID="f9c9ea5dacc30a4725422f09469af597e45f1270f27d4a328c1574188aa58091" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.583882 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.584751 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.585971 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.586047 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.586076 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.587159 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.587214 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.587239 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.587287 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:29 crc kubenswrapper[4893]: I0314 06:59:29.587471 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 06:59:29 crc kubenswrapper[4893]: E0314 06:59:29.587790 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 
20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:29 crc kubenswrapper[4893]: E0314 06:59:29.599879 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:30 crc kubenswrapper[4893]: I0314 06:59:30.326553 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:30 crc kubenswrapper[4893]: E0314 06:59:30.339059 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:30 crc kubenswrapper[4893]: I0314 06:59:30.590262 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 06:59:31 crc kubenswrapper[4893]: I0314 06:59:31.324875 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:32 crc kubenswrapper[4893]: I0314 06:59:32.324038 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.322997 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.402952 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.403212 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.404919 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.404992 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.405014 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:33 crc kubenswrapper[4893]: I0314 06:59:33.405929 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 06:59:33 crc kubenswrapper[4893]: E0314 06:59:33.406229 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:34 crc kubenswrapper[4893]: I0314 06:59:34.326043 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.005943 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.006331 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.008465 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.008550 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.008574 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.009514 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 06:59:35 crc kubenswrapper[4893]: E0314 06:59:35.009801 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:35 crc kubenswrapper[4893]: I0314 06:59:35.327364 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:35 crc kubenswrapper[4893]: E0314 06:59:35.857077 4893 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.238911 4893 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.238993 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:36 crc kubenswrapper[4893]: E0314 06:59:36.245369 4893 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f99cc8cee4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:36 crc kubenswrapper[4893]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca2f99cc8cee4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:36 crc 
kubenswrapper[4893]: body: Mar 14 06:59:36 crc kubenswrapper[4893]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:06.238095076 +0000 UTC m=+25.500271868,LastTimestamp:2026-03-14 06:59:36.238965585 +0000 UTC m=+55.501142377,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 06:59:36 crc kubenswrapper[4893]: > Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.323854 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.600891 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.602304 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.602334 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.602342 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:36 crc kubenswrapper[4893]: I0314 06:59:36.602362 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:36 crc kubenswrapper[4893]: E0314 06:59:36.609105 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:37 crc kubenswrapper[4893]: I0314 06:59:37.325104 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:37 crc kubenswrapper[4893]: E0314 06:59:37.345725 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.323918 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.541192 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.541360 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.542602 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.542640 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:38 crc kubenswrapper[4893]: I0314 06:59:38.542657 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:39 crc kubenswrapper[4893]: I0314 06:59:39.326182 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:40 crc 
kubenswrapper[4893]: I0314 06:59:40.324589 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:41 crc kubenswrapper[4893]: I0314 06:59:41.324976 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:42 crc kubenswrapper[4893]: I0314 06:59:42.323144 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.251348 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.251566 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.253134 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.253203 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.253222 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.257101 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:43 crc 
kubenswrapper[4893]: I0314 06:59:43.322185 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.610065 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.611545 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.611627 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.611647 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.611723 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:43 crc kubenswrapper[4893]: E0314 06:59:43.617237 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.626953 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.627916 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.627970 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:43 crc kubenswrapper[4893]: I0314 06:59:43.627986 4893 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:44 crc kubenswrapper[4893]: I0314 06:59:44.322886 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:44 crc kubenswrapper[4893]: E0314 06:59:44.351090 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:45 crc kubenswrapper[4893]: I0314 06:59:45.321805 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:45 crc kubenswrapper[4893]: E0314 06:59:45.857147 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:46 crc kubenswrapper[4893]: I0314 06:59:46.321788 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:47 crc kubenswrapper[4893]: I0314 06:59:47.322233 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.322052 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.375686 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.377010 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.377052 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.377062 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:48 crc kubenswrapper[4893]: I0314 06:59:48.377652 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 06:59:48 crc kubenswrapper[4893]: E0314 06:59:48.377852 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:49 crc kubenswrapper[4893]: I0314 06:59:49.322623 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.323975 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.425274 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.446025 4893 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.617604 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.620735 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.620788 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.620803 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:50 crc kubenswrapper[4893]: I0314 06:59:50.620835 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:50 crc kubenswrapper[4893]: E0314 06:59:50.630790 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:51 crc kubenswrapper[4893]: I0314 06:59:51.326856 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:51 crc kubenswrapper[4893]: E0314 06:59:51.359856 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" 
cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:52 crc kubenswrapper[4893]: I0314 06:59:52.323091 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:52 crc kubenswrapper[4893]: W0314 06:59:52.817989 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 14 06:59:52 crc kubenswrapper[4893]: E0314 06:59:52.818232 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:53 crc kubenswrapper[4893]: I0314 06:59:53.322578 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:54 crc kubenswrapper[4893]: I0314 06:59:54.322902 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:55 crc kubenswrapper[4893]: I0314 06:59:55.321822 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:55 crc 
kubenswrapper[4893]: E0314 06:59:55.857225 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:55 crc kubenswrapper[4893]: W0314 06:59:55.918198 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:55 crc kubenswrapper[4893]: E0314 06:59:55.918257 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:56 crc kubenswrapper[4893]: I0314 06:59:56.322537 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.322792 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:57 crc kubenswrapper[4893]: W0314 06:59:57.512642 4893 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 14 06:59:57 crc kubenswrapper[4893]: E0314 06:59:57.512712 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.631233 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.632964 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.633060 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.633083 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:57 crc kubenswrapper[4893]: I0314 06:59:57.633119 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:57 crc kubenswrapper[4893]: E0314 06:59:57.637214 4893 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:58 crc kubenswrapper[4893]: I0314 06:59:58.322513 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:58 crc kubenswrapper[4893]: E0314 06:59:58.364951 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:58 crc kubenswrapper[4893]: W0314 06:59:58.628823 4893 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 14 06:59:58 crc kubenswrapper[4893]: E0314 06:59:58.628937 4893 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:59 crc kubenswrapper[4893]: I0314 06:59:59.322738 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 07:00:00 crc kubenswrapper[4893]: I0314 07:00:00.323197 4893 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 07:00:00 crc kubenswrapper[4893]: I0314 07:00:00.627940 4893 csr.go:261] certificate signing request csr-884sx is approved, waiting to be issued Mar 14 07:00:00 crc kubenswrapper[4893]: I0314 07:00:00.643342 4893 csr.go:257] certificate signing request csr-884sx is issued Mar 14 07:00:00 crc kubenswrapper[4893]: I0314 07:00:00.694751 4893 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 14 07:00:01 crc kubenswrapper[4893]: I0314 07:00:01.180457 4893 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 14 07:00:01 crc kubenswrapper[4893]: I0314 07:00:01.645568 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 
20:04:35.806208687 +0000 UTC Mar 14 07:00:01 crc kubenswrapper[4893]: I0314 07:00:01.645674 4893 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7141h4m34.16054099s for next certificate rotation Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.376809 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.379241 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.379332 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.379360 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.380762 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.680256 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.683750 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6"} Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.684283 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.686273 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:02 
crc kubenswrapper[4893]: I0314 07:00:02.686319 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:02 crc kubenswrapper[4893]: I0314 07:00:02.686333 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.687498 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.689247 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.692008 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" exitCode=255 Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.692034 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6"} Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.692094 4893 scope.go:117] "RemoveContainer" containerID="a62aeda46c76e76449bd037e1594b4d96e68d2519229dd27665299c24d5142b9" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.692300 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.693319 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.693499 4893 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.693720 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:03 crc kubenswrapper[4893]: I0314 07:00:03.695711 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:03 crc kubenswrapper[4893]: E0314 07:00:03.696177 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.637596 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.639622 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.639689 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.639709 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.639901 4893 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.654176 4893 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.654742 4893 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 14 
07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.654797 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.659707 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.659779 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.659809 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.659847 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.659873 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:04Z","lastTransitionTime":"2026-03-14T07:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.687236 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.697912 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.702042 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.702155 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.702189 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.702219 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.702241 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:04Z","lastTransitionTime":"2026-03-14T07:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.720381 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.732267 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.732314 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.732329 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.732344 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.732354 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:04Z","lastTransitionTime":"2026-03-14T07:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.751439 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.763825 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.763881 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.763896 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.763918 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:04 crc kubenswrapper[4893]: I0314 07:00:04.763937 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:04Z","lastTransitionTime":"2026-03-14T07:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.776453 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.776795 4893 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.776854 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.877476 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:04 crc kubenswrapper[4893]: E0314 07:00:04.977899 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.005261 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.005625 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.007362 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.007418 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.007435 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:05 crc kubenswrapper[4893]: I0314 07:00:05.008241 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:05 crc kubenswrapper[4893]: 
E0314 07:00:05.008480 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.078721 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.179833 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.280503 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.381351 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.482060 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.582680 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.683596 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.784509 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.857492 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 07:00:05 crc 
kubenswrapper[4893]: E0314 07:00:05.884928 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:05 crc kubenswrapper[4893]: E0314 07:00:05.985702 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.086635 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.186942 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.288141 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.388942 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.489577 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.590598 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.691190 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.792136 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.892828 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:06 crc kubenswrapper[4893]: E0314 07:00:06.993119 4893 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.094218 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.195223 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.295484 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.395818 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.496269 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.596839 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.697969 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.798708 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.899391 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[4893]: E0314 07:00:07.999625 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.099903 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.200240 4893 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.301385 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.402455 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.503575 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.604585 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.704699 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.805307 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[4893]: E0314 07:00:08.905804 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.006984 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.108129 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.209185 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.310219 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: I0314 
07:00:09.376615 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:09 crc kubenswrapper[4893]: I0314 07:00:09.377815 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:09 crc kubenswrapper[4893]: I0314 07:00:09.377946 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:09 crc kubenswrapper[4893]: I0314 07:00:09.378026 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.411329 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.512418 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.613596 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.714587 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.815973 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[4893]: E0314 07:00:09.916917 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.017920 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.118995 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.220053 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.321032 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.421431 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.522252 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.622710 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.723001 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.824122 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[4893]: E0314 07:00:10.924767 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.025224 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.126357 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.227168 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.327760 4893 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.428414 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.529571 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.630216 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.730801 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.831379 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[4893]: E0314 07:00:11.931652 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.032593 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.133636 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.234107 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.334234 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.434804 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.535684 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.635796 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.736386 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.837546 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[4893]: E0314 07:00:12.937758 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.038096 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.139267 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.239987 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.340765 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.403085 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.403284 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.404308 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 
07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.404364 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.404391 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:13 crc kubenswrapper[4893]: I0314 07:00:13.405111 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.405283 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.441425 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.542464 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.643420 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.744836 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.846419 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[4893]: E0314 07:00:13.948215 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.048706 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.148963 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.249470 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.349613 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.450872 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.551765 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.651917 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.752089 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.839550 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.844649 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.844704 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.844722 4893 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.844749 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.844769 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:14Z","lastTransitionTime":"2026-03-14T07:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.861870 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.874174 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.874458 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.874735 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.875379 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.875701 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:14Z","lastTransitionTime":"2026-03-14T07:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.896058 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.907383 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.907431 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.907440 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.907454 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.907463 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:14Z","lastTransitionTime":"2026-03-14T07:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.916937 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.924686 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.924776 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.924797 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.924824 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:14 crc kubenswrapper[4893]: I0314 07:00:14.924842 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:14Z","lastTransitionTime":"2026-03-14T07:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.942202 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.942893 4893 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:14 crc kubenswrapper[4893]: E0314 07:00:14.943047 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.043793 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.144820 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.246032 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.347056 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.447740 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.548346 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.648947 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.749685 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.850284 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.857687 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[4893]: E0314 07:00:15.951388 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.051834 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.152238 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.252449 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.353456 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.454120 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.554664 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.655801 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.755962 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.856873 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 14 07:00:16 crc kubenswrapper[4893]: E0314 07:00:16.957650 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.057811 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.158284 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.258814 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.359383 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.460508 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.561084 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.662034 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.763063 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.864042 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[4893]: E0314 07:00:17.964261 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.064489 4893 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.164861 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.265885 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.366795 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.467448 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.567540 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.668578 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.769094 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.870232 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:18 crc kubenswrapper[4893]: E0314 07:00:18.970497 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.071492 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.171675 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.272593 4893 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.372902 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.473463 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.573911 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.674996 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.775145 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.875717 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:19 crc kubenswrapper[4893]: E0314 07:00:19.976486 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.077353 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.178223 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.278470 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.379514 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc 
kubenswrapper[4893]: E0314 07:00:20.479804 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.580828 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.681236 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.781774 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.882807 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:20 crc kubenswrapper[4893]: E0314 07:00:20.982929 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.083095 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.184022 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.284587 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.385493 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.486234 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.586905 4893 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.687561 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.788339 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.889072 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:21 crc kubenswrapper[4893]: E0314 07:00:21.989457 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.090375 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.191218 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.292408 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.393076 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.494158 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.594407 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.695583 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.796478 4893 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.896831 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:22 crc kubenswrapper[4893]: E0314 07:00:22.997176 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.097869 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.198528 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.299350 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.399604 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.500129 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.601189 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.701602 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.802198 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:23 crc kubenswrapper[4893]: E0314 07:00:23.902657 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 
07:00:24.003104 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.104093 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.205003 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.305190 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: I0314 07:00:24.375771 4893 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:24 crc kubenswrapper[4893]: I0314 07:00:24.377132 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[4893]: I0314 07:00:24.377170 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[4893]: I0314 07:00:24.377181 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[4893]: I0314 07:00:24.377812 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.377978 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:24 crc kubenswrapper[4893]: 
E0314 07:00:24.406001 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.506936 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.608095 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.708930 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.809605 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:24 crc kubenswrapper[4893]: E0314 07:00:24.910550 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.001136 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.005410 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.005451 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.005462 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.005480 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.005494 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.015453 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.021182 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.021240 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.021257 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.021279 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.021296 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.030422 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.037665 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.037706 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.037718 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.037736 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.037748 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.047833 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.054654 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.054688 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.054697 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.054711 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[4893]: I0314 07:00:25.054721 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.066149 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b2309444-9d73-4619-8320-90d0fa1f365c\\\",\\\"systemUUID\\\":\\\"89dd883c-4665-4ad4-a9f9-9508898f96bf\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.066263 4893 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.066286 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.166945 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.267446 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.368234 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.468800 4893 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.569816 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.670716 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.771562 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.858432 4893 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.872174 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:25 crc kubenswrapper[4893]: E0314 07:00:25.972894 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.073555 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.174689 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.274805 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.375208 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.475797 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 
07:00:26.576849 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.677930 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.778575 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.879804 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:26 crc kubenswrapper[4893]: E0314 07:00:26.980203 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.081068 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.181989 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.282402 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.383299 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.483698 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.584587 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.685014 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 
07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.785618 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.885986 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:27 crc kubenswrapper[4893]: E0314 07:00:27.986730 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.087500 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: I0314 07:00:28.135670 4893 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.187667 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.288090 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.388451 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.488747 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.589488 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.690160 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.791003 4893 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.891289 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:28 crc kubenswrapper[4893]: E0314 07:00:28.991914 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.092727 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.193064 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.293466 4893 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.345286 4893 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.358455 4893 apiserver.go:52] "Watching apiserver" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.363905 4893 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.364399 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-dns/node-resolver-tfcrc","openshift-image-registry/node-ca-5bsmj","openshift-machine-config-operator/machine-config-daemon-d4x6q","openshift-multus/multus-additional-cni-plugins-klzvc","openshift-multus/multus-hk75c","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-cbskd","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.364887 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.364973 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.364997 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.365040 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.365201 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365363 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365582 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365627 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365634 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365677 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.365681 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.365712 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.366040 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.366923 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.367187 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.368700 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.369352 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.369710 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.370915 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371063 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371303 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371498 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371632 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371727 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371843 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.371936 4893 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372038 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372338 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372591 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372704 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372853 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372904 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372961 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.372852 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.373038 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.373106 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.373914 4893 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374116 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374586 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374725 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374753 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374828 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374897 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374950 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374953 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.374987 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.375079 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.375130 4893 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.375268 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.375443 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.392892 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.395797 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.395830 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.395842 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.395859 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.395870 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.404840 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.418444 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.422039 4893 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.429340 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.439503 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tfcrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xsgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tfcrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.452190 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.461247 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5bsmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f24038e-0fe8-4285-86be-96e95c29827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rksds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5bsmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.473961 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.473994 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474012 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474030 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474045 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474059 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474076 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474090 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474155 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474178 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474230 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474257 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474311 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474346 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474395 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474420 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474469 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474496 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474551 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474578 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474626 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474624 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474652 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474674 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474740 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474766 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474813 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474837 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474860 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474910 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474889 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474932 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.474977 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475003 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475026 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475080 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475102 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475154 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475177 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475218 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475239 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475258 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475276 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475302 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475382 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475376 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475432 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475456 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475479 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475496 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475513 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475545 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475564 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475585 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475607 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475631 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475661 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475682 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475711 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475733 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475755 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475777 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475797 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475838 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475856 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475871 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475890 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475908 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475924 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475939 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475954 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475969 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475988 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476007 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476025 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476044 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476063 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476083 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476109 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.476126 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476142 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476156 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476171 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476188 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476204 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476218 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476233 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476249 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476265 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476292 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476307 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476324 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476342 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476361 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476378 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476396 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476414 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476432 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476452 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476471 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476494 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476572 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476597 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476618 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476642 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476657 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476672 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476688 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476704 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476720 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476735 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476752 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476769 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476785 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476801 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476821 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476845 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476869 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476893 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476909 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476932 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476955 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476978 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476997 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477014 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477034 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477057 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477080 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477101 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477125 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477147 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477168 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477190 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477212 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477232 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477254 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477280 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477302 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477327 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477352 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:29 crc 
kubenswrapper[4893]: I0314 07:00:29.477377 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477399 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477421 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477444 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477468 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477496 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477574 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477607 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477628 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477648 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477672 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.477694 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477717 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477740 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477762 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477789 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477843 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477866 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477887 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477910 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477933 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477957 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.477979 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478000 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478022 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478044 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478067 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478090 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478115 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478139 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478163 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478185 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478207 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478320 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478341 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478358 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478375 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478392 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478409 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478424 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478440 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478458 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478476 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478491 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478510 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478555 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478580 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478601 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478622 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478641 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478658 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478673 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478692 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478714 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478733 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478750 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478767 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478783 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478804 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478822 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478881 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5zb\" (UniqueName: \"kubernetes.io/projected/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-kube-api-access-5x5zb\") pod \"multus-additional-cni-plugins-klzvc\" (UID: 
\"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478907 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-etc-kubernetes\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478941 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478965 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478983 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479000 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-cnibin\") pod \"multus-hk75c\" (UID: 
\"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479017 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479033 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-proxy-tls\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479048 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-system-cni-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479065 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-system-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479092 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479111 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-mcd-auth-proxy-config\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479128 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-kubelet\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479144 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-daemon-config\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479166 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgd4\" (UniqueName: \"kubernetes.io/projected/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-kube-api-access-xsgd4\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479184 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-bin\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479205 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-k8s-cni-cncf-io\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479222 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479240 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479256 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479273 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7f24038e-0fe8-4285-86be-96e95c29827b-host\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479289 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksds\" (UniqueName: \"kubernetes.io/projected/7f24038e-0fe8-4285-86be-96e95c29827b-kube-api-access-rksds\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479307 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479323 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479343 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479360 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kp2qs\" (UniqueName: \"kubernetes.io/projected/9d0cffc0-c15f-4461-817c-1a937ad2afba-kube-api-access-kp2qs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479377 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-hosts-file\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479393 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479410 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd7rs\" (UniqueName: \"kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479428 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479448 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479464 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwwm\" (UniqueName: \"kubernetes.io/projected/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-kube-api-access-kbwwm\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479484 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-binary-copy\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479500 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-conf-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479531 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-multus-certs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.479547 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479569 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479585 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479603 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479620 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479641 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479662 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cnibin\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479679 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f24038e-0fe8-4285-86be-96e95c29827b-serviceca\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479700 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479722 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479739 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479758 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-cni-binary-copy\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479775 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479794 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479813 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479829 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479845 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-os-release\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479861 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-netns\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479875 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-os-release\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479906 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479924 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479943 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-rootfs\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479961 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479977 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479996 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-hostroot\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480023 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480045 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480070 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480093 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-socket-dir-parent\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480124 
4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-multus\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480144 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480161 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480218 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480233 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480248 4893 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.480267 4893 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475561 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475741 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475619 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.475610 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476032 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476003 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476074 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476245 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476440 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476445 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476466 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476780 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476829 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.476838 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477119 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477150 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477164 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477602 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477625 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477695 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477764 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477778 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.477794 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478301 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478450 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478481 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478585 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.478704 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.487067 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479047 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479141 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479167 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479575 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479689 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479752 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.479953 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.480132 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.481188 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.481545 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.481555 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.481840 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.481927 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.482264 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.482303 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.482359 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.482698 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.483114 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.483360 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.483374 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.483424 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.483835 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484110 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484214 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484252 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484280 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484713 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.484931 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485101 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485119 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485138 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485340 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485472 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485510 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485765 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485783 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485792 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.485864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486185 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486391 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486471 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486487 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486873 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.487003 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.486682 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.487917 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.488005 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.488033 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.490936 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.491249 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.491538 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.491727 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.491764 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.491911 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-klzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.492254 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.492408 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:29.99238799 +0000 UTC m=+109.254564782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.492777 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493095 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493177 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493181 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493218 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493646 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493662 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493658 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493936 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.493947 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.494059 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.494283 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.494496 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.494504 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.494573 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495142 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495226 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495292 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495341 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495553 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495597 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495555 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.495737 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496354 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496357 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496475 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496540 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496807 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.496391 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.497182 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.497272 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.497598 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.497772 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.498102 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.497961 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.498166 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.498396 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.498448 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.498941 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.499191 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.499272 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:29.999247116 +0000 UTC m=+109.261423908 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.499269 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.500157 4893 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.500252 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.500697 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.501770 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.501836 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.501855 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.501870 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.501393 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.500835 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.503046 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.503223 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.503391 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.503697 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.504747 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.505252 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.505501 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.505776 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506064 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506075 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506365 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506371 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.506436 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.506508 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:30.006487082 +0000 UTC m=+109.268663874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506593 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.506816 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.507015 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.507632 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.507851 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.508471 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.508598 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.508725 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.509049 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.509316 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.510262 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.510611 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.511782 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.512453 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.513017 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.513299 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod 
"25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.513445 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.513552 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.513652 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.513826 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:30.013771718 +0000 UTC m=+109.275948520 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.514642 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.514714 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.514822 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.514833 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.515131 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.515540 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.515583 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.515682 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.515812 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516046 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516088 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516314 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516432 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516798 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516905 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.516990 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.517147 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.517211 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.517693 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.517707 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.517759 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:30.017735684 +0000 UTC m=+109.279912476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.517392 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.517877 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.517957 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.518302 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.518805 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.518817 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.518980 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.519160 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.519671 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.520310 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.520748 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.520752 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.520751 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521126 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521285 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521337 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521512 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521736 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521770 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521818 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.521842 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hk75c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0cffc0-c15f-4461-817c-1a937ad2afba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kp2qs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hk75c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.524177 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.528165 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.528183 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.537344 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.542085 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cbskd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.557553 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.558242 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.564449 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.569988 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d4x6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.570229 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.581821 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-hosts-file\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.581903 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.581929 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd7rs\" (UniqueName: \"kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.581951 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.581996 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwwm\" (UniqueName: \"kubernetes.io/projected/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-kube-api-access-kbwwm\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582014 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-binary-copy\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582031 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-conf-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582073 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-multus-certs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582100 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582145 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582167 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582195 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582237 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cnibin\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582258 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f24038e-0fe8-4285-86be-96e95c29827b-serviceca\") pod 
\"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582315 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582335 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-cni-binary-copy\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582356 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582397 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582416 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582436 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582475 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-os-release\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582493 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-netns\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582513 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-rootfs\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582724 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582766 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582786 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-os-release\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582805 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582832 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-socket-dir-parent\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582850 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-multus\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582870 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-hostroot\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582892 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582911 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582931 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5zb\" (UniqueName: \"kubernetes.io/projected/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-kube-api-access-5x5zb\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582951 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-etc-kubernetes\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.582982 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583000 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-cnibin\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583020 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583040 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-proxy-tls\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583061 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-system-cni-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583079 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-system-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583100 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-mcd-auth-proxy-config\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583119 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-kubelet\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583138 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-daemon-config\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583156 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgd4\" (UniqueName: \"kubernetes.io/projected/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-kube-api-access-xsgd4\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583175 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-bin\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583193 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583211 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583228 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583247 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f24038e-0fe8-4285-86be-96e95c29827b-host\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583265 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rksds\" (UniqueName: \"kubernetes.io/projected/7f24038e-0fe8-4285-86be-96e95c29827b-kube-api-access-rksds\") pod \"node-ca-5bsmj\" (UID: 
\"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583284 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-k8s-cni-cncf-io\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583302 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583320 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583339 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583359 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2qs\" (UniqueName: \"kubernetes.io/projected/9d0cffc0-c15f-4461-817c-1a937ad2afba-kube-api-access-kp2qs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " 
pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583434 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583448 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583460 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583470 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583481 4893 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583492 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583503 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583515 4893 
reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583559 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583571 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583582 4893 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583592 4893 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583603 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583613 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583623 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583854 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-hosts-file\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.583888 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.584638 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585173 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cni-binary-copy\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585218 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-conf-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " 
pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585244 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-multus-certs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585494 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585534 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585556 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585742 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc 
kubenswrapper[4893]: I0314 07:00:29.585795 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585809 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585820 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585834 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585846 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585858 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585869 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585878 4893 reconciler_common.go:293] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585887 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585896 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585904 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585912 4893 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585921 4893 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585929 4893 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585937 4893 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585945 4893 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585953 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585961 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585969 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585978 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585986 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.585994 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586002 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586009 4893 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586017 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586025 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586032 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586040 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586049 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586057 4893 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586065 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586073 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586081 4893 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586089 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586097 4893 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586106 4893 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586114 4893 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586122 4893 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586130 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586139 4893 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586148 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586156 4893 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586164 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586172 4893 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586180 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586187 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586195 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586203 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586212 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586220 4893 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586228 4893 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586237 4893 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586246 4893 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586254 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586261 4893 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586269 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586276 4893 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586285 4893 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586293 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586301 4893 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586309 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586317 4893 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586325 4893 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586333 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586340 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586348 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586356 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586365 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586373 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586380 4893 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586389 4893 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586396 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586404 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586412 4893 reconciler_common.go:293] "Volume detached 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586419 4893 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586436 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586445 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586452 4893 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586461 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586469 4893 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586477 4893 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586485 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586492 4893 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586500 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586508 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586535 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586544 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586552 4893 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586560 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586569 4893 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586577 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586585 4893 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586593 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586607 4893 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586614 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 14 
07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586622 4893 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586630 4893 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586638 4893 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586646 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586654 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586662 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586670 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586678 4893 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586685 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586693 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586701 4893 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586710 4893 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586718 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586726 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586734 4893 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586743 4893 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586751 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586756 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-cnibin\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586760 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586780 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-rootfs\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586790 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc 
kubenswrapper[4893]: I0314 07:00:29.593557 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593604 4893 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593630 4893 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593647 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593663 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593685 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593708 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593723 4893 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593740 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593781 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593797 4893 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593817 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593832 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593849 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593864 4893 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593883 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593897 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593910 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593938 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593951 4893 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593964 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593982 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.593995 4893 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594015 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594030 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594048 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594061 4893 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594073 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594089 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc 
kubenswrapper[4893]: I0314 07:00:29.594103 4893 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594118 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594139 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594151 4893 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594163 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594175 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594192 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594203 4893 reconciler_common.go:293] "Volume detached for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594214 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594225 4893 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594243 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594255 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594267 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594285 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594297 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594308 4893 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594320 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594337 4893 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594350 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594363 4893 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594375 4893 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594393 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594406 4893 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594419 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594432 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594450 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594464 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594477 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594498 4893 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594511 4893 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594545 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.594558 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.590593 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f24038e-0fe8-4285-86be-96e95c29827b-serviceca\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586729 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-etc-kubernetes\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586811 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586830 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586867 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-os-release\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586882 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586904 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-socket-dir-parent\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586918 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-multus\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586933 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-hostroot\") pod \"multus-hk75c\" (UID: 
\"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586950 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.586965 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.587635 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.587778 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.587991 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 
07:00:29.588031 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-cnibin\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588057 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588138 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-cni-binary-copy\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588168 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588213 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f24038e-0fe8-4285-86be-96e95c29827b-host\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588237 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-cni-bin\") 
pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588256 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588280 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588302 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588711 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-system-cni-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588734 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588826 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-mcd-auth-proxy-config\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588875 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-k8s-cni-cncf-io\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588889 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-system-cni-dir\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588898 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.588927 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-os-release\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: 
I0314 07:00:29.588988 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-run-netns\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.589005 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d0cffc0-c15f-4461-817c-1a937ad2afba-host-var-lib-kubelet\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.589396 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d0cffc0-c15f-4461-817c-1a937ad2afba-multus-daemon-config\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.589404 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.597194 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.599667 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.603535 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-proxy-tls\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.603582 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2qs\" (UniqueName: \"kubernetes.io/projected/9d0cffc0-c15f-4461-817c-1a937ad2afba-kube-api-access-kp2qs\") pod \"multus-hk75c\" (UID: \"9d0cffc0-c15f-4461-817c-1a937ad2afba\") " pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.604714 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.604736 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.604744 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.604757 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.604766 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.606988 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5zb\" (UniqueName: \"kubernetes.io/projected/bdb2af2a-f12a-4cc9-ae9a-368e0f129c18-kube-api-access-5x5zb\") pod \"multus-additional-cni-plugins-klzvc\" (UID: \"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\") " pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.608044 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwwm\" (UniqueName: \"kubernetes.io/projected/ad6724e5-48cf-4417-ae51-b1cb8c6af70d-kube-api-access-kbwwm\") pod \"machine-config-daemon-d4x6q\" (UID: \"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\") " pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.610374 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgd4\" (UniqueName: \"kubernetes.io/projected/a0c1abff-0fdb-44d5-8dd8-c15eb3fca612-kube-api-access-xsgd4\") pod \"node-resolver-tfcrc\" (UID: \"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\") " pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.617170 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd7rs\" (UniqueName: \"kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs\") pod \"ovnkube-node-cbskd\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.617180 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksds\" (UniqueName: 
\"kubernetes.io/projected/7f24038e-0fe8-4285-86be-96e95c29827b-kube-api-access-rksds\") pod \"node-ca-5bsmj\" (UID: \"7f24038e-0fe8-4285-86be-96e95c29827b\") " pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.688039 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.696982 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.702830 4893 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.706346 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.706371 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.706380 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.706395 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.706413 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.709346 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5bsmj" Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.712584 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-b877b987e0c69c0c1c5ac86440524cfd5b3843041cb0b1a0bf8f4d5e3a1c65c6 WatchSource:0}: Error finding container b877b987e0c69c0c1c5ac86440524cfd5b3843041cb0b1a0bf8f4d5e3a1c65c6: Status 404 returned error can't find the container with id b877b987e0c69c0c1c5ac86440524cfd5b3843041cb0b1a0bf8f4d5e3a1c65c6 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.719226 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tfcrc" Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.729096 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f24038e_0fe8_4285_86be_96e95c29827b.slice/crio-fb9ceb72194a1999fda022ba41b1298c7581d1fbee3a316039c094e8d6382213 WatchSource:0}: Error finding container fb9ceb72194a1999fda022ba41b1298c7581d1fbee3a316039c094e8d6382213: Status 404 returned error can't find the container with id fb9ceb72194a1999fda022ba41b1298c7581d1fbee3a316039c094e8d6382213 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.729970 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.742836 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-klzvc" Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.749866 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c1abff_0fdb_44d5_8dd8_c15eb3fca612.slice/crio-3d08c8fcb9bba21cce429376aafe72e871763c4216289e4f7b5bd868e489513f WatchSource:0}: Error finding container 3d08c8fcb9bba21cce429376aafe72e871763c4216289e4f7b5bd868e489513f: Status 404 returned error can't find the container with id 3d08c8fcb9bba21cce429376aafe72e871763c4216289e4f7b5bd868e489513f Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.752767 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.758817 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b877b987e0c69c0c1c5ac86440524cfd5b3843041cb0b1a0bf8f4d5e3a1c65c6"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.759700 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hk75c" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.760105 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tfcrc" event={"ID":"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612","Type":"ContainerStarted","Data":"3d08c8fcb9bba21cce429376aafe72e871763c4216289e4f7b5bd868e489513f"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.761309 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5bsmj" event={"ID":"7f24038e-0fe8-4285-86be-96e95c29827b","Type":"ContainerStarted","Data":"fb9ceb72194a1999fda022ba41b1298c7581d1fbee3a316039c094e8d6382213"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.762078 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"64d01f5f0927f57a4fbc876b8feb0435c1ecd894d9bca6829f49ecb844071964"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.767900 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.796504 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb2af2a_f12a_4cc9_ae9a_368e0f129c18.slice/crio-52393543484be0e0fadc5614e3968d034688f72dbef4307530a5cf216ba58542 WatchSource:0}: Error finding container 52393543484be0e0fadc5614e3968d034688f72dbef4307530a5cf216ba58542: Status 404 returned error can't find the container with id 52393543484be0e0fadc5614e3968d034688f72dbef4307530a5cf216ba58542 Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.802084 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0cffc0_c15f_4461_817c_1a937ad2afba.slice/crio-a4f7576a77b8b7fe991e8968b0899eb1c40886a02cdf2b2cd7eafa611f70dcaa WatchSource:0}: Error finding container a4f7576a77b8b7fe991e8968b0899eb1c40886a02cdf2b2cd7eafa611f70dcaa: Status 404 returned error can't find the container with id a4f7576a77b8b7fe991e8968b0899eb1c40886a02cdf2b2cd7eafa611f70dcaa Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.808595 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.808644 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.808998 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.809020 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.809029 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.812789 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e4b73d9d1a43e4e46dccbe21bc957b31f2831a0d29cfb4fc017d947e944bebd2 WatchSource:0}: Error finding container e4b73d9d1a43e4e46dccbe21bc957b31f2831a0d29cfb4fc017d947e944bebd2: Status 404 returned error can't find the container with id e4b73d9d1a43e4e46dccbe21bc957b31f2831a0d29cfb4fc017d947e944bebd2 Mar 14 07:00:29 crc kubenswrapper[4893]: W0314 07:00:29.820017 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa7a2543_8a0d_4d25_9e7a_bc387f9662df.slice/crio-2a560e3ee24dac91f1ed7b177a5be2c4f340aa317b22682e48b6608050ff8859 WatchSource:0}: Error finding container 2a560e3ee24dac91f1ed7b177a5be2c4f340aa317b22682e48b6608050ff8859: Status 404 returned error can't find the container with id 2a560e3ee24dac91f1ed7b177a5be2c4f340aa317b22682e48b6608050ff8859 Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.913781 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.913807 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.913816 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc 
kubenswrapper[4893]: I0314 07:00:29.913830 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.913838 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[4893]: I0314 07:00:29.996667 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:29 crc kubenswrapper[4893]: E0314 07:00:29.996860 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:30.996832838 +0000 UTC m=+110.259009650 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.019824 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.019887 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.019900 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.019919 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.019941 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.097630 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.097671 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.097692 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.097710 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097813 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097828 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097839 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097873 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:31.097860917 +0000 UTC m=+110.360037709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097907 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097928 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:31.097921138 +0000 UTC m=+110.360097930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097967 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.097992 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:31.09798516 +0000 UTC m=+110.360161952 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.098037 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.098047 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.098054 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:30 crc kubenswrapper[4893]: E0314 07:00:30.098073 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:31.098067012 +0000 UTC m=+110.360243804 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.122274 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.122330 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.122347 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.122369 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.122383 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.224482 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.224542 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.224555 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.224571 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.224581 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.326711 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.326751 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.326763 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.326780 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.326792 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.428680 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.428740 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.428750 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.428765 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.428774 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.532131 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.532190 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.532204 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.532228 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.532247 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.634862 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.634908 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.634919 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.634935 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.634946 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.738113 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.738180 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.738192 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.738214 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.738226 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.766843 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f" exitCode=0 Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.766920 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.766951 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"2a560e3ee24dac91f1ed7b177a5be2c4f340aa317b22682e48b6608050ff8859"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.770896 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"523e9aca0e5eed44612057b10e02bef99c49b106a70747ec01930fa72cf66bfc"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.770973 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.770989 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"81d44d26d2efbfab7119843e9ad357c02de68489182c4666b3f474aa3ef7cbd5"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 
07:00:30.772618 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"448896e9ff516d8c9a1a737b78fc8aae1f7404b4e29069564072dfcefeeeb5a7"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.773889 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e4b73d9d1a43e4e46dccbe21bc957b31f2831a0d29cfb4fc017d947e944bebd2"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.775813 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4847f894b03f4ba9cef945d6ea3e6082ce906df7f650d421374cfcb3fc06d37f"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.775868 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"31ee0aeffeb990f98cf4da832e1d0aed3e0a2f98bafa533a677a7848ca2ed2f2"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.777400 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8" exitCode=0 Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.777487 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.777513 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerStarted","Data":"52393543484be0e0fadc5614e3968d034688f72dbef4307530a5cf216ba58542"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.779870 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tfcrc" event={"ID":"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612","Type":"ContainerStarted","Data":"ad810608cf86dc6f4e3619dff881eca144ebe37c5689f55b55b258ec2689f062"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.781715 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5bsmj" event={"ID":"7f24038e-0fe8-4285-86be-96e95c29827b","Type":"ContainerStarted","Data":"e2d87c66b4cd7cf1075f9c0513ac785fcb0e0baa2cf37a092c9c4118a0eb0c8c"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.783385 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hk75c" event={"ID":"9d0cffc0-c15f-4461-817c-1a937ad2afba","Type":"ContainerStarted","Data":"cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.783622 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hk75c" event={"ID":"9d0cffc0-c15f-4461-817c-1a937ad2afba","Type":"ContainerStarted","Data":"a4f7576a77b8b7fe991e8968b0899eb1c40886a02cdf2b2cd7eafa611f70dcaa"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.790268 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.806173 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.821609 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tfcrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xsgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tfcrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.835367 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.841867 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.841905 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.841915 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.841931 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.841943 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.846690 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5bsmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f24038e-0fe8-4285-86be-96e95c29827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rksds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5bsmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.858807 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.873483 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-klzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.884430 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.896660 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hk75c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0cffc0-c15f-4461-817c-1a937ad2afba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kp2qs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hk75c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.914435 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cbskd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.932232 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.944970 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.944996 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.945004 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc 
kubenswrapper[4893]: I0314 07:00:30.945017 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.945026 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.946046 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d4x6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.959018 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://448896e9ff516d8c9a1a737b78fc8aae1f7404b4e29069564072dfcefeeeb5a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.972416 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.984378 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tfcrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad810608cf86dc6f4e3619dff881eca144ebe37c5689f55b55b258ec2689f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xsgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tfcrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[4893]: I0314 07:00:30.997763 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.008362 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.008537 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.008495742 +0000 UTC m=+112.270672534 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.010426 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5bsmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f24038e-0fe8-4285-86be-96e95c29827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d87c66b4cd7cf1075f9c0513ac785fcb0e0baa2cf37a092c9c4118a0eb0c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rksds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5bsmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.023247 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4847f894b03f4ba9cef945d6ea3e6082ce906df7f650d421374cfcb3fc06d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ee0aeffeb990f98cf4da832e1d0aed3e0a2f98bafa533a677a7848ca2ed2f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.036565 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-klzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.046513 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.046565 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc 
kubenswrapper[4893]: I0314 07:00:31.046573 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.046588 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.046599 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.047986 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.060467 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hk75c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0cffc0-c15f-4461-817c-1a937ad2afba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kp2qs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hk75c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.082341 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cbskd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.099486 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.109802 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.109855 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.109881 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.109911 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110016 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110067 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.110052753 +0000 UTC m=+112.372229545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110275 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110301 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110321 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110335 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110345 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.11032987 +0000 UTC m=+112.372506662 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110362 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.110353981 +0000 UTC m=+112.372530773 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110609 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110695 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110771 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.110884 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.110871914 +0000 UTC m=+112.373048806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.112711 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523e9aca0e5eed44612057b10e02bef99c49b106a70747ec01930fa72cf66bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b93320f866f07b1494ab844854d58a4a60af152
6c128c8f2df7794c38234a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d4x6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.150469 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.150501 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.150512 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc 
kubenswrapper[4893]: I0314 07:00:31.150546 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.150560 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.254303 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.254348 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.254357 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.254375 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.254386 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.357758 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.357800 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.357810 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.357826 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.357837 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.375888 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.376026 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.376319 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.376375 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.376738 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:31 crc kubenswrapper[4893]: E0314 07:00:31.376791 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.382169 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.383628 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.386315 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.387669 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.394872 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.396102 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.396915 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.398231 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.399003 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.400010 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.400608 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.401920 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.402533 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.403620 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.404379 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.404799 4893 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.405508 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.406296 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.406936 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.408763 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.410045 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.412870 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.413675 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.417008 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.417790 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.418924 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.419808 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.421501 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.423003 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.423645 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hk75c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0cffc0-c15f-4461-817c-1a937ad2afba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kp2qs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" 
for pod \"openshift-multus\"/\"multus-hk75c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.423750 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.424792 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.425321 4893 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.425428 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.427692 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.428335 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.428829 4893 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.430465 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.431855 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.432429 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.433797 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.437468 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.439016 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.441137 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.442965 4893 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.443683 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.444773 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.445752 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.447680 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.448692 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.449043 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd7rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cbskd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.449858 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.450380 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.451374 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.451915 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.452502 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 
07:00:31.454395 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.455147 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.460763 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.460814 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.460827 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.460887 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.460906 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.471735 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4847f894b03f4ba9cef945d6ea3e6082ce906df7f650d421374cfcb3fc06d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ee0aeffeb990f98cf4da832e1d0aed3e0a2f98bafa533a677a7848ca2ed2f2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.485690 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c0fb081068fe12c63e5e2b58027b3b89a53d4ab2a6e521509ae58936c0c3c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5zb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-klzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.497826 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad6724e5-48cf-4417-ae51-b1cb8c6af70d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523e9aca0e5eed44612057b10e02bef99c49b106a70747ec01930fa72cf66bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b93320f866f07b1494ab844854d58a4a60af152
6c128c8f2df7794c38234a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbwwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d4x6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.515897 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.529727 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tfcrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad810608cf86dc6f4e3619dff881eca144ebe37c5689f55b55b258ec2689f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xsgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tfcrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.543863 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.553412 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5bsmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f24038e-0fe8-4285-86be-96e95c29827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d87c66b4cd7cf1075f9c0513ac785fcb0e0baa2cf37a092c9c4118a0eb0c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rksds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5bsmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.562574 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.562602 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.562610 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.562624 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.562636 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.564065 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://448896e9ff516d8c9a1a737b78fc8aae1f7404b4e29069564072dfcefeeeb5a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.573764 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.664259 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.664293 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.664302 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.664318 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.664329 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.766681 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.766715 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.766725 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.766740 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.766751 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.791483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerStarted","Data":"28699ce876444560690e715a56673846cc3a90335fa837b181232e537f6fa62f"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.795080 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"ddc3fe94134a2ec81663c46aa2f6886c3f380d1c7c5eba5a895d452f8d140953"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.795115 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"03a4f2bcd347b9f116d0bb722090bc3d64c5bbada85d7a2085b0df772b554219"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.795124 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"6f567feeedfca9666b2b3013232517cb979a7009fe9e67e796e5cfa0e8c747be"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.795132 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"d26c4e4548c14b8e7000a08e95791dbba2dbf19d141a9339a1189eef4a6671d4"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.805980 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://448896e9ff516d8c9a1a737b78fc8aae1f7404b4e29069564072dfcefeeeb5a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.821428 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.836867 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tfcrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c1abff-0fdb-44d5-8dd8-c15eb3fca612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad810608cf86dc6f4e3619dff881eca144ebe37c5689f55b55b258ec2689f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xsgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tfcrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.851589 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.871771 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.871803 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.871811 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.871831 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.871841 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.873374 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5bsmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f24038e-0fe8-4285-86be-96e95c29827b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d87c66b4cd7cf1075f9c0513ac785fcb0e0baa2cf37a092c9c4118a0eb0c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rksds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5bsmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.903033 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080e3e57-b882-4fa3-9a79-e607e9d60285\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42891f5e3a86654fc47e53765ed9c9e2ce9d0e39ed604bf43ce1c76b529ed497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8974c67f4129571db545fe36e7f7f95724131ca2c106f911fb6818194973164b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492e2e17e5f7c53b824566055b07fef61f9fb8bac73a68ca7c75594d234132c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ef7634592d5d444110f1a0effaa6ed17ea63d7159bcb2ccc5137d7ab2d5bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95004116439f8ed03a26654e0d7eaf325a1c1810c8799d6a9ab77642607e98c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95e14126ef8d4e1faa84d25aab7dcf4bad7686037ad804d5a026263069ab672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f95e14126ef8d4e1faa84d25aab7dcf4bad7686037ad804d5a026263069ab672\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42efd85c7afaac4c43b9abfc44f5beaee818177931d19d862a16e7a73baea7f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42efd85c7afaac4c43b9abfc44f5beaee818177931d19d862a16e7a73baea7f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12e9eb77cfafad5ca5ca1ca71de32acd0bd172b4ff6bba5d43d51b1eb43c3a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e9eb77cfafad5ca5ca1ca71de32acd0bd172b4ff6bba5d43d51b1eb43c3a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.924195 4893 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4847f894b03f4ba9cef945d6ea3e6082ce906df7f650d421374cfcb3fc06d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ee0aeffeb990f98cf4da832e1d0aed3e0a2f98bafa533a677a7848ca2ed2f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.974687 4893 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.974762 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.974793 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.974823 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[4893]: I0314 07:00:31.974835 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.015467 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hk75c" podStartSLOduration=49.015449541 podStartE2EDuration="49.015449541s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:31.990301072 +0000 UTC m=+111.252477864" watchObservedRunningTime="2026-03-14 07:00:32.015449541 +0000 UTC m=+111.277626323" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.040465 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podStartSLOduration=49.040448178 podStartE2EDuration="49.040448178s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:32.040314484 +0000 UTC m=+111.302491286" watchObservedRunningTime="2026-03-14 07:00:32.040448178 +0000 UTC m=+111.302624970" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.077146 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.077174 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.077181 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.077193 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.077202 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.097775 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b"] Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.098156 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.100194 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.102615 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.129601 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.129579657 podStartE2EDuration="1.129579657s" podCreationTimestamp="2026-03-14 07:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:32.128999084 +0000 UTC m=+111.391175896" watchObservedRunningTime="2026-03-14 07:00:32.129579657 +0000 UTC m=+111.391756469" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.142330 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7qz9v"] Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.142802 4893 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: E0314 07:00:32.142858 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.179843 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.179892 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.179902 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.179918 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.179930 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.206646 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tfcrc" podStartSLOduration=49.206624325 podStartE2EDuration="49.206624325s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:32.206584974 +0000 UTC m=+111.468761766" watchObservedRunningTime="2026-03-14 07:00:32.206624325 +0000 UTC m=+111.468801117" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222022 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222081 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27zb\" (UniqueName: \"kubernetes.io/projected/3bbb5656-173c-4f14-b307-e5195b172ddd-kube-api-access-s27zb\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222103 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222124 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222142 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrhq\" (UniqueName: \"kubernetes.io/projected/1a62643f-27cd-449a-ba98-c3ad3d21f579-kube-api-access-jkrhq\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.222157 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.235358 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5bsmj" podStartSLOduration=49.235343642 podStartE2EDuration="49.235343642s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:32.235054855 +0000 UTC m=+111.497231647" watchObservedRunningTime="2026-03-14 07:00:32.235343642 +0000 UTC m=+111.497520434" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.282167 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.282203 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.282214 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.282228 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.282239 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.322987 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323125 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27zb\" (UniqueName: \"kubernetes.io/projected/3bbb5656-173c-4f14-b307-e5195b172ddd-kube-api-access-s27zb\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323228 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323391 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323494 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrhq\" (UniqueName: 
\"kubernetes.io/projected/1a62643f-27cd-449a-ba98-c3ad3d21f579-kube-api-access-jkrhq\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323618 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323752 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: E0314 07:00:32.323789 4893 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.323930 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: E0314 07:00:32.323981 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs podName:3bbb5656-173c-4f14-b307-e5195b172ddd nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:32.823939859 +0000 UTC m=+112.086116681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs") pod "network-metrics-daemon-7qz9v" (UID: "3bbb5656-173c-4f14-b307-e5195b172ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.332048 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62643f-27cd-449a-ba98-c3ad3d21f579-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.342905 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrhq\" (UniqueName: \"kubernetes.io/projected/1a62643f-27cd-449a-ba98-c3ad3d21f579-kube-api-access-jkrhq\") pod \"ovnkube-control-plane-749d76644c-wn48b\" (UID: \"1a62643f-27cd-449a-ba98-c3ad3d21f579\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.344346 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27zb\" (UniqueName: \"kubernetes.io/projected/3bbb5656-173c-4f14-b307-e5195b172ddd-kube-api-access-s27zb\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.383997 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.384041 4893 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.384054 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.384070 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.384080 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.412587 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.486928 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.487004 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.487024 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.487053 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.487074 4893 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.589825 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.589871 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.589882 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.589901 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.589913 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.692971 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.693011 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.693024 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.693040 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.693052 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.795813 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.795880 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.795896 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.795919 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.795937 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.801038 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"f37f44adb89d717c3f8f96c0bec6073b95833268279249dd58f74cc6101590b7"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.801095 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"67df34ecb8f8d052c704bb968928a75eda35defdf1070c97adace5a42bf07a75"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.802419 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"edc5f089ca4221afb9859fcd0d47ae8262eac499bd59d1fb5f12c44802f7f7c5"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.807787 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" event={"ID":"1a62643f-27cd-449a-ba98-c3ad3d21f579","Type":"ContainerStarted","Data":"4c48347adbbfee57c5606aa50b5c6111f99db7500cd5cad8b8cf02c3be7c5539"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.807823 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" event={"ID":"1a62643f-27cd-449a-ba98-c3ad3d21f579","Type":"ContainerStarted","Data":"5cfe30b7e1ce56daf0110ddb57b409b85f14afa9cdbb3de9f83e8eb27500d0f4"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.807836 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" 
event={"ID":"1a62643f-27cd-449a-ba98-c3ad3d21f579","Type":"ContainerStarted","Data":"a69e1f9fb2767e9a8c2320683bfbde8aa5795ab5a4a408d308366f4c919584c2"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.816837 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="28699ce876444560690e715a56673846cc3a90335fa837b181232e537f6fa62f" exitCode=0 Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.816905 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"28699ce876444560690e715a56673846cc3a90335fa837b181232e537f6fa62f"} Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.832114 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:32 crc kubenswrapper[4893]: E0314 07:00:32.832187 4893 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:32 crc kubenswrapper[4893]: E0314 07:00:32.832334 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs podName:3bbb5656-173c-4f14-b307-e5195b172ddd nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.832304742 +0000 UTC m=+113.094481704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs") pod "network-metrics-daemon-7qz9v" (UID: "3bbb5656-173c-4f14-b307-e5195b172ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.846066 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wn48b" podStartSLOduration=49.846013415 podStartE2EDuration="49.846013415s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:32.844744704 +0000 UTC m=+112.106921496" watchObservedRunningTime="2026-03-14 07:00:32.846013415 +0000 UTC m=+112.108190207" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.898427 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.898791 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.898803 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.898817 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[4893]: I0314 07:00:32.898828 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.001328 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.001428 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.001454 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.001489 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.001509 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.034853 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.035069 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:37.035049927 +0000 UTC m=+116.297226719 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.105447 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.105515 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.105570 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.105591 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.105602 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.135986 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.136026 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.136044 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.136087 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136157 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136211 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:37.136196729 +0000 UTC m=+116.398373521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136212 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136215 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136228 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136243 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136279 4893 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:37.1362626 +0000 UTC m=+116.398439392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136300 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:37.136291271 +0000 UTC m=+116.398468063 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136363 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136395 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136405 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.136439 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:37.136431244 +0000 UTC m=+116.398608036 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.208572 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.208603 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.208612 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.208624 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.208632 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.310740 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.310785 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.310797 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.310812 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.310824 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.376100 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.376213 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.376567 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.376618 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.376061 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.378821 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.412953 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.412987 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.412995 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.413009 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.413018 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.515982 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.516709 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.516743 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.516765 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.516775 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.619091 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.619128 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.619138 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.619153 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.619163 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.721284 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.721337 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.721349 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.721372 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.721384 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822858 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822893 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822903 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822917 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822922 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="b41c95413fb5dffcfa5bd50ef13e952eb50ac59d55255bce4c3f5cc6b0e26acc" exitCode=0 Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.823001 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"b41c95413fb5dffcfa5bd50ef13e952eb50ac59d55255bce4c3f5cc6b0e26acc"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.822927 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.846593 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.846748 4893 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: E0314 07:00:33.846824 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs podName:3bbb5656-173c-4f14-b307-e5195b172ddd nodeName:}" failed. No retries permitted until 2026-03-14 07:00:35.846808035 +0000 UTC m=+115.108984827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs") pod "network-metrics-daemon-7qz9v" (UID: "3bbb5656-173c-4f14-b307-e5195b172ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.925550 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.925586 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.925594 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.925607 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[4893]: I0314 07:00:33.925616 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.030274 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.030451 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.030561 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.030671 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.030757 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.134095 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.134130 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.134138 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.134153 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.134163 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.237069 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.237103 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.237113 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.237129 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.237139 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.339219 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.339246 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.339254 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.339266 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.339275 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.375989 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:34 crc kubenswrapper[4893]: E0314 07:00:34.376086 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.441853 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.441884 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.441892 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.441906 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.441915 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.544678 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.544734 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.544750 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.544772 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.544788 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.648215 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.648262 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.648279 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.648301 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.648317 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.751176 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.751249 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.751274 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.751333 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.751351 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.829289 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="4663d50a04b909d56def6a7fca0b66270e887cf519589a5434bd4a1cccb9b3c7" exitCode=0 Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.829389 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"4663d50a04b909d56def6a7fca0b66270e887cf519589a5434bd4a1cccb9b3c7"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.840661 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"99ccd9e40fe6a70d8fd34f384f62c4df059e937f65a039b068126039882d4784"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.855589 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.855633 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.855645 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.855662 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.855674 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.958067 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.958103 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.958112 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.958131 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[4893]: I0314 07:00:34.958141 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.059681 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.059714 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.059732 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.059745 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.059754 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.075953 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.075985 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.075993 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.076007 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.076020 4893 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.130016 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr"] Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.130635 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.132672 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.132694 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.133219 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.133431 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.259246 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.259950 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.260083 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.260189 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.260295 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361007 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361093 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361119 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361146 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361171 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361233 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.361283 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.362051 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.373817 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.376551 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.376682 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.376795 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.377097 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.377246 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.377327 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.381297 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc0ff48-dce2-4b1b-863c-cc56390a78ec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8kvjr\" (UID: \"8dc0ff48-dce2-4b1b-863c-cc56390a78ec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.385363 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.388388 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.388630 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.389039 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.393347 4893 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.449391 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" Mar 14 07:00:35 crc kubenswrapper[4893]: W0314 07:00:35.465726 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc0ff48_dce2_4b1b_863c_cc56390a78ec.slice/crio-c7c71beea7db7a2b52a57856cef6be090b8715bd34a5e8b43fa4a8e0f06293dd WatchSource:0}: Error finding container c7c71beea7db7a2b52a57856cef6be090b8715bd34a5e8b43fa4a8e0f06293dd: Status 404 returned error can't find the container with id c7c71beea7db7a2b52a57856cef6be090b8715bd34a5e8b43fa4a8e0f06293dd Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.844132 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" event={"ID":"8dc0ff48-dce2-4b1b-863c-cc56390a78ec","Type":"ContainerStarted","Data":"c7c71beea7db7a2b52a57856cef6be090b8715bd34a5e8b43fa4a8e0f06293dd"} Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.846605 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerStarted","Data":"481d1a305bc90d99212120fc7f84c0bb2c3057cda5b25cc2c02fd287c120217a"} Mar 14 07:00:35 crc kubenswrapper[4893]: I0314 07:00:35.847069 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.847240 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:35 crc kubenswrapper[4893]: 
I0314 07:00:35.865619 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.865803 4893 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:35 crc kubenswrapper[4893]: E0314 07:00:35.865868 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs podName:3bbb5656-173c-4f14-b307-e5195b172ddd nodeName:}" failed. No retries permitted until 2026-03-14 07:00:39.86585061 +0000 UTC m=+119.128027402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs") pod "network-metrics-daemon-7qz9v" (UID: "3bbb5656-173c-4f14-b307-e5195b172ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.376553 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:36 crc kubenswrapper[4893]: E0314 07:00:36.376740 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.856879 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerStarted","Data":"7a807995d9c790c0f2ca6db51e41256c6db5022f6edfa08930fed8f85a99319e"} Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.857415 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.857548 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.857597 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.862071 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" event={"ID":"8dc0ff48-dce2-4b1b-863c-cc56390a78ec","Type":"ContainerStarted","Data":"54843697675a1c7b4b1b1b8fb6f6be974ddafa98c219af4f5593f994cbf38d22"} Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.890041 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.896962 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.920131 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podStartSLOduration=53.920111597 podStartE2EDuration="53.920111597s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:36.902250664 +0000 UTC m=+116.164427476" watchObservedRunningTime="2026-03-14 07:00:36.920111597 +0000 UTC m=+116.182288389" Mar 14 07:00:36 crc kubenswrapper[4893]: I0314 07:00:36.992206 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8kvjr" podStartSLOduration=53.992187833 podStartE2EDuration="53.992187833s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:36.99161002 +0000 UTC m=+116.253786832" watchObservedRunningTime="2026-03-14 07:00:36.992187833 +0000 UTC m=+116.254364625" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.080936 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.081106 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:45.081087459 +0000 UTC m=+124.343264251 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.182045 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.182096 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.182125 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.182155 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182295 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182290 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182361 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:45.182344084 +0000 UTC m=+124.444520876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182356 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182409 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:45.182379155 +0000 UTC m=+124.444556007 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182412 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182442 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182498 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:45.182481017 +0000 UTC m=+124.444657809 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182305 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182541 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182555 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.182591 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:45.182582999 +0000 UTC m=+124.444759791 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.376487 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.376548 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.376583 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.376650 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.376791 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:37 crc kubenswrapper[4893]: E0314 07:00:37.376971 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.867664 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="481d1a305bc90d99212120fc7f84c0bb2c3057cda5b25cc2c02fd287c120217a" exitCode=0 Mar 14 07:00:37 crc kubenswrapper[4893]: I0314 07:00:37.867764 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"481d1a305bc90d99212120fc7f84c0bb2c3057cda5b25cc2c02fd287c120217a"} Mar 14 07:00:38 crc kubenswrapper[4893]: I0314 07:00:38.375770 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:38 crc kubenswrapper[4893]: E0314 07:00:38.375888 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:38 crc kubenswrapper[4893]: I0314 07:00:38.873665 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb2af2a-f12a-4cc9-ae9a-368e0f129c18" containerID="e2143665a2b7a49d2b12716a963613dc3f913cafbdf789be237b57d8356cdf9a" exitCode=0 Mar 14 07:00:38 crc kubenswrapper[4893]: I0314 07:00:38.873720 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerDied","Data":"e2143665a2b7a49d2b12716a963613dc3f913cafbdf789be237b57d8356cdf9a"} Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.376317 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.376356 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.376387 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.376447 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.376526 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.376637 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.603969 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7qz9v"] Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.604380 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.604476 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.879489 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-klzvc" event={"ID":"bdb2af2a-f12a-4cc9-ae9a-368e0f129c18","Type":"ContainerStarted","Data":"f583ce79ddd4926b13f8f3ef3d558a8baae25507a13b16720baa77698431f015"} Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.901052 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-klzvc" podStartSLOduration=56.901035388 podStartE2EDuration="56.901035388s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:39.899247704 +0000 UTC m=+119.161424516" watchObservedRunningTime="2026-03-14 07:00:39.901035388 +0000 UTC m=+119.163212180" Mar 14 07:00:39 crc kubenswrapper[4893]: I0314 07:00:39.911733 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.911909 4893 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:39 crc kubenswrapper[4893]: E0314 07:00:39.911971 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs podName:3bbb5656-173c-4f14-b307-e5195b172ddd nodeName:}" failed. No retries permitted until 2026-03-14 07:00:47.911952633 +0000 UTC m=+127.174129425 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs") pod "network-metrics-daemon-7qz9v" (UID: "3bbb5656-173c-4f14-b307-e5195b172ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:40 crc kubenswrapper[4893]: I0314 07:00:40.013399 4893 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:41 crc kubenswrapper[4893]: E0314 07:00:41.338394 4893 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 14 07:00:41 crc kubenswrapper[4893]: I0314 07:00:41.375819 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:41 crc kubenswrapper[4893]: I0314 07:00:41.376005 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:41 crc kubenswrapper[4893]: E0314 07:00:41.377729 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:41 crc kubenswrapper[4893]: I0314 07:00:41.377751 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:41 crc kubenswrapper[4893]: I0314 07:00:41.377783 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:41 crc kubenswrapper[4893]: E0314 07:00:41.377941 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:41 crc kubenswrapper[4893]: E0314 07:00:41.378089 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:41 crc kubenswrapper[4893]: E0314 07:00:41.378146 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:43 crc kubenswrapper[4893]: I0314 07:00:43.376376 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:43 crc kubenswrapper[4893]: I0314 07:00:43.376442 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:43 crc kubenswrapper[4893]: I0314 07:00:43.376464 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:43 crc kubenswrapper[4893]: E0314 07:00:43.376522 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:43 crc kubenswrapper[4893]: E0314 07:00:43.376619 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:43 crc kubenswrapper[4893]: I0314 07:00:43.376647 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:43 crc kubenswrapper[4893]: E0314 07:00:43.376786 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:43 crc kubenswrapper[4893]: E0314 07:00:43.376882 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.155507 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.155712 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.155687466 +0000 UTC m=+140.417864258 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.256825 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.257095 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257015 4893 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.257188 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257257 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.257241067 +0000 UTC m=+140.519417959 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257172 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257336 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257362 4893 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.257352 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257430 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.257410521 +0000 UTC m=+140.519587313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257512 4893 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257605 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.257584235 +0000 UTC m=+140.519761097 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257747 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257820 4893 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.257897 4893 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.258009 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.257993506 +0000 UTC m=+140.520170388 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.376295 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.376291 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.376432 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.376578 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:45 crc kubenswrapper[4893]: I0314 07:00:45.376634 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.376719 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.376784 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qz9v" podUID="3bbb5656-173c-4f14-b307-e5195b172ddd" Mar 14 07:00:45 crc kubenswrapper[4893]: E0314 07:00:45.376890 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.375638 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.375820 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.375843 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.376678 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.376767 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.377392 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.378244 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.378540 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.379400 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.379502 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.379500 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.906847 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.908769 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61"} Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.909338 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.932596 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.93257588 podStartE2EDuration="12.93257588s" podCreationTimestamp="2026-03-14 07:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:47.931780621 +0000 UTC m=+127.193957453" watchObservedRunningTime="2026-03-14 07:00:47.93257588 +0000 UTC m=+127.194752692" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.981642 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:47 crc kubenswrapper[4893]: I0314 07:00:47.992488 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bbb5656-173c-4f14-b307-e5195b172ddd-metrics-certs\") pod \"network-metrics-daemon-7qz9v\" (UID: \"3bbb5656-173c-4f14-b307-e5195b172ddd\") " pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.290745 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qz9v" Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.488002 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7qz9v"] Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.912442 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qz9v" event={"ID":"3bbb5656-173c-4f14-b307-e5195b172ddd","Type":"ContainerStarted","Data":"e41b5741633864f24c7cfda616e9b0baf19d5b8d7bffd437f2c381d546d978e8"} Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.912483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qz9v" event={"ID":"3bbb5656-173c-4f14-b307-e5195b172ddd","Type":"ContainerStarted","Data":"edcde6424938325c6c2fe19af0f3230051589daa2f0cf168497df10c43b997cd"} Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.912496 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qz9v" event={"ID":"3bbb5656-173c-4f14-b307-e5195b172ddd","Type":"ContainerStarted","Data":"9dced8d421f04a93d32400a980730cd7b52da6fd7e9602fd4383da6085184914"} Mar 14 07:00:48 crc kubenswrapper[4893]: I0314 07:00:48.924219 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7qz9v" podStartSLOduration=65.924208788 podStartE2EDuration="1m5.924208788s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:48.922951048 +0000 UTC m=+128.185127840" watchObservedRunningTime="2026-03-14 07:00:48.924208788 +0000 UTC m=+128.186385570" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.321183 4893 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 
07:00:55.356683 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.357035 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.358972 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.359172 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.360711 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.360983 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.361196 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.361391 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.361705 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.361749 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"] Mar 14 07:00:55 crc 
kubenswrapper[4893]: I0314 07:00:55.361927 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-psm2j"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.362116 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.362347 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.367269 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cgrf8"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.367855 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sfrgk"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.368221 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.368322 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-btl2s"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.368737 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.368807 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.370717 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.371283 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.371555 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.371914 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.376278 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.376293 4893 reflector.go:561] object-"openshift-image-registry"/"installation-pull-secrets": failed to list *v1.Secret: secrets "installation-pull-secrets" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.376323 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.376352 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"installation-pull-secrets\": Failed to watch *v1.Secret: failed 
to list *v1.Secret: secrets \"installation-pull-secrets\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.376391 4893 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.376406 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.376432 4893 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.376459 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and 
this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.376460 4893 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.376505 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.384807 4893 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.384900 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.384924 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.385198 4893 reflector.go:561] object-"openshift-apiserver"/"audit-1": 
failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.385236 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.385297 4893 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.385320 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.385323 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc 
kubenswrapper[4893]: E0314 07:00:55.385427 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.385399 4893 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.385457 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.385402 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.385939 4893 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 
07:00:55.385992 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.386064 4893 reflector.go:561] object-"openshift-image-registry"/"registry-dockercfg-kzzsd": failed to list *v1.Secret: secrets "registry-dockercfg-kzzsd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.386105 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"registry-dockercfg-kzzsd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"registry-dockercfg-kzzsd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.386124 4893 reflector.go:561] object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.386232 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource 
"configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.386149 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.386265 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.386940 4893 reflector.go:561] object-"openshift-console-operator"/"console-operator-config": failed to list *v1.ConfigMap: configmaps "console-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.386971 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"console-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 
'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.387001 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.399494 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.399578 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.399633 4893 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.399646 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this 
object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.399679 4893 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.399691 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.399704 4893 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.399716 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.399746 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:crc" 
cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.399756 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.399815 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.400051 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.400176 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.400273 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.400502 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.400636 4893 reflector.go:561] object-"openshift-console"/"service-ca": failed to list *v1.ConfigMap: configmaps "service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: 
E0314 07:00:55.400727 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.400502 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.400834 4893 reflector.go:561] object-"openshift-console-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.400988 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.401371 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.401555 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.401728 4893 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.401936 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.402041 4893 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.402149 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.402330 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.402461 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.402482 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets 
\"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.402555 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.402600 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.402058 4893 reflector.go:561] object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr": failed to list *v1.Secret: secrets "console-operator-dockercfg-4xjcr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.402850 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4xjcr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-operator-dockercfg-4xjcr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.402331 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.403007 4893 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" 
cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403072 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.403125 4893 reflector.go:561] object-"openshift-console-operator"/"trusted-ca": failed to list *v1.ConfigMap: configmaps "trusted-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403151 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.403199 4893 reflector.go:561] object-"openshift-console-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403210 4893 reflector.go:158] "Unhandled 
Error" err="object-\"openshift-console-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.402924 4893 reflector.go:561] object-"openshift-image-registry"/"trusted-ca": failed to list *v1.ConfigMap: configmaps "trusted-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403466 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.403558 4893 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403579 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.402969 4893 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403600 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.403844 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.403931 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404211 4893 reflector.go:561] 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404224 4893 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404238 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404250 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404271 4893 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" 
cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404281 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404294 4893 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404309 4893 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404318 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404306 4893 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.404356 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.404914 4893 reflector.go:561] object-"openshift-image-registry"/"image-registry-tls": failed to list *v1.Secret: secrets "image-registry-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.404947 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"image-registry-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"image-registry-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.405100 4893 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.405119 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list 
*v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.405558 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.405740 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.405864 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.406284 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.407000 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.407378 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5"] Mar 14 07:00:55 crc kubenswrapper[4893]: W0314 07:00:55.408065 4893 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-znhcc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 14 07:00:55 crc kubenswrapper[4893]: E0314 07:00:55.408101 4893 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to 
watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-znhcc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.408473 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.408757 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.408939 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.409100 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.412014 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.412245 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.413237 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vkgvv"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.413470 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.413781 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.413785 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.413863 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.414448 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.414410 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.415400 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.415812 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gnrsd"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.416325 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.416547 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.416978 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.417945 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.418216 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.421346 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4jvl"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.421835 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.422204 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.422387 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.422842 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425312 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425360 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425470 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ss7r8"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425568 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425676 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425857 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.425973 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.426043 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg"] Mar 14 07:00:55 crc 
kubenswrapper[4893]: I0314 07:00:55.426300 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.426450 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.426748 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.427386 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.440978 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.442594 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.443788 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.444679 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.445131 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.446052 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.446258 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.466627 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.474274 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.474671 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.467501 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475030 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.468008 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.472440 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475344 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475362 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475383 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475397 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b95986-0c2a-4589-98cf-e3834cf5982d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475414 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475429 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475447 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475461 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475475 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-policies\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475495 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdb8df65-c9b0-488a-9653-c0e97024027b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475512 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807a844d-47c0-4dc3-b820-0ea0069a28b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475542 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b95986-0c2a-4589-98cf-e3834cf5982d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475557 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rg6g\" (UniqueName: \"kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475573 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475588 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475603 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475617 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " 
pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475633 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475649 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqqml\" (UniqueName: \"kubernetes.io/projected/bdb8df65-c9b0-488a-9653-c0e97024027b-kube-api-access-hqqml\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475663 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-node-pullsecrets\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475677 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtp6t\" (UniqueName: \"kubernetes.io/projected/ef07db86-4677-41ce-8b6d-7960cc63a9b8-kube-api-access-gtp6t\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475692 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s7pcb\" (UniqueName: \"kubernetes.io/projected/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-kube-api-access-s7pcb\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475707 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475725 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475740 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-dir\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475757 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit-dir\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475770 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d258\" (UniqueName: \"kubernetes.io/projected/8cacf151-878c-4b98-accc-7731c17b2de8-kube-api-access-8d258\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475788 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475801 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475819 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mk4m\" (UniqueName: \"kubernetes.io/projected/44475c5a-f261-48e3-be03-2a5b4f127e5a-kube-api-access-7mk4m\") pod \"migrator-59844c95c7-52b8f\" (UID: \"44475c5a-f261-48e3-be03-2a5b4f127e5a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475836 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bdb8df65-c9b0-488a-9653-c0e97024027b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475851 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475867 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxkd\" (UniqueName: \"kubernetes.io/projected/55b95986-0c2a-4589-98cf-e3834cf5982d-kube-api-access-mdxkd\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475881 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475898 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjrq\" (UniqueName: \"kubernetes.io/projected/871c529e-b263-4600-982a-d7be266f86e4-kube-api-access-7pjrq\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 
07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475915 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807a844d-47c0-4dc3-b820-0ea0069a28b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475931 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475949 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475967 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-srv-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.475986 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476021 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476041 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476075 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476097 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476119 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-client\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476134 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb8bt\" (UniqueName: \"kubernetes.io/projected/39b2c526-e373-404b-a607-bdd8d29c8fae-kube-api-access-sb8bt\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476150 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-serving-cert\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476166 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476180 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-encryption-config\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476194 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476209 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbpjl\" (UniqueName: \"kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.476229 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807a844d-47c0-4dc3-b820-0ea0069a28b6-config\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469346 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469378 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469495 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 
07:00:55.469553 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469680 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469704 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469738 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.469977 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470030 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470061 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470093 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470123 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470150 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470180 4893 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470207 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470234 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470261 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.470289 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.472063 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.472729 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.480547 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.487840 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2vgnl"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.488424 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.491777 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.492391 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.493272 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.501674 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.502398 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.503602 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.509095 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.509225 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.509822 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.510165 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.511509 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512035 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512293 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512290 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512449 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmj8x"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512876 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512890 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.512912 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.513017 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.513777 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.514216 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.514563 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sfrgk"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.514596 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.514608 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.514694 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.515629 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.517212 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-psm2j"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.518376 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gnrsd"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.519448 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.520763 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cgrf8"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.522043 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2p482"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.523867 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.524894 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.526491 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.530390 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.532667 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.535754 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.541881 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.542128 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.542224 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-btl2s"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.543166 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vkgvv"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.544120 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.546715 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.548018 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.550007 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.551439 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.552828 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.554017 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4jvl"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.555596 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.557199 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.558641 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.561364 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6xtr4"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.562112 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.563143 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.564094 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.565257 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.566087 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.567209 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.568241 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.570006 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-phr5l"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.571016 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.571106 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.572201 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.573092 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.574017 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2vgnl"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.574980 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-phr5l"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.575913 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmj8x"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576790 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807a844d-47c0-4dc3-b820-0ea0069a28b6-config\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576814 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576830 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576847 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576864 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576878 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b95986-0c2a-4589-98cf-e3834cf5982d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576893 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576907 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576921 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576935 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-policies\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576951 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdb8df65-c9b0-488a-9653-c0e97024027b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576967 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576982 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807a844d-47c0-4dc3-b820-0ea0069a28b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.576996 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b95986-0c2a-4589-98cf-e3834cf5982d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577013 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rg6g\" (UniqueName: \"kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577029 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577044 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577061 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577075 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577091 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577104 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqqml\" (UniqueName: \"kubernetes.io/projected/bdb8df65-c9b0-488a-9653-c0e97024027b-kube-api-access-hqqml\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577132 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-node-pullsecrets\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577148 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtp6t\" (UniqueName: \"kubernetes.io/projected/ef07db86-4677-41ce-8b6d-7960cc63a9b8-kube-api-access-gtp6t\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577163 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7pcb\" (UniqueName: \"kubernetes.io/projected/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-kube-api-access-s7pcb\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577180 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577194 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577212 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit-dir\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577228 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d258\" (UniqueName: \"kubernetes.io/projected/8cacf151-878c-4b98-accc-7731c17b2de8-kube-api-access-8d258\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577242 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-dir\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577259 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577275 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7mk4m\" (UniqueName: \"kubernetes.io/projected/44475c5a-f261-48e3-be03-2a5b4f127e5a-kube-api-access-7mk4m\") pod \"migrator-59844c95c7-52b8f\" (UID: \"44475c5a-f261-48e3-be03-2a5b4f127e5a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577290 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577305 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdb8df65-c9b0-488a-9653-c0e97024027b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577320 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577336 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxkd\" (UniqueName: \"kubernetes.io/projected/55b95986-0c2a-4589-98cf-e3834cf5982d-kube-api-access-mdxkd\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 
07:00:55.577352 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577368 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjrq\" (UniqueName: \"kubernetes.io/projected/871c529e-b263-4600-982a-d7be266f86e4-kube-api-access-7pjrq\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577383 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807a844d-47c0-4dc3-b820-0ea0069a28b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577397 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577411 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " 
pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577425 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-srv-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577441 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577464 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577479 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577507 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: 
\"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577540 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577561 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-client\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577575 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb8bt\" (UniqueName: \"kubernetes.io/projected/39b2c526-e373-404b-a607-bdd8d29c8fae-kube-api-access-sb8bt\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577591 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-serving-cert\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577606 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit\") pod \"apiserver-76f77b778f-btl2s\" (UID: 
\"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577620 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-encryption-config\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577634 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.577649 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbpjl\" (UniqueName: \"kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.578385 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit-dir\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.578407 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807a844d-47c0-4dc3-b820-0ea0069a28b6-config\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: 
\"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.578772 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-dir\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.579321 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.579421 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.579478 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b95986-0c2a-4589-98cf-e3834cf5982d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.579634 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 07:00:55 crc 
kubenswrapper[4893]: I0314 07:00:55.580482 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef07db86-4677-41ce-8b6d-7960cc63a9b8-node-pullsecrets\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.581113 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.581793 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39b2c526-e373-404b-a607-bdd8d29c8fae-audit-policies\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.582379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.582379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.582637 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdb8df65-c9b0-488a-9653-c0e97024027b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.584566 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.583139 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdb8df65-c9b0-488a-9653-c0e97024027b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.583173 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b95986-0c2a-4589-98cf-e3834cf5982d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.583189 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-etcd-client\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 
07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.583977 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.584352 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-encryption-config\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.584508 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.584513 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b2c526-e373-404b-a607-bdd8d29c8fae-serving-cert\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.582675 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc 
kubenswrapper[4893]: I0314 07:00:55.585101 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.592146 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.597742 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jh6jw"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.599023 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh6jw" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.600859 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.608333 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807a844d-47c0-4dc3-b820-0ea0069a28b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.609268 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xxwxx"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.610047 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.611146 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xxwxx"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.613255 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jh6jw"] Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.619317 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.639973 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.645221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.659081 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.678589 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.681693 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: 
\"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.698925 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.719144 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.724082 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.739817 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.759276 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.778613 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.798659 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.819022 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.839202 4893 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.859506 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.879135 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.899774 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.918867 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.932773 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-srv-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.939470 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.959409 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.972616 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8cacf151-878c-4b98-accc-7731c17b2de8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.979813 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 07:00:55 crc kubenswrapper[4893]: I0314 07:00:55.999174 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.018896 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.038685 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.060856 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.079574 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.099613 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.120015 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.179614 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.199271 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.218353 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.239595 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.259937 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.279408 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.304803 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.319276 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.339090 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.358721 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.379409 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.398773 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.419643 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.438620 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.460183 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.479280 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.498219 4893 request.go:700] Waited for 1.009398127s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.500483 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.519627 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.539456 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.559686 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579453 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed 
to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579508 4893 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579564 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079513831 +0000 UTC m=+136.341690623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579676 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079650355 +0000 UTC m=+136.341827157 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579455 4893 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579748 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579770 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config podName:871c529e-b263-4600-982a-d7be266f86e4 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079743447 +0000 UTC m=+136.341920319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config") pod "machine-api-operator-5694c8668f-vbg6j" (UID: "871c529e-b263-4600-982a-d7be266f86e4") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579800 4893 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579797 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579809 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079789878 +0000 UTC m=+136.341966730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579841 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079830729 +0000 UTC m=+136.342007531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579862 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079851979 +0000 UTC m=+136.342028781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579862 4893 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579941 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls podName:871c529e-b263-4600-982a-d7be266f86e4 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.079922471 +0000 UTC m=+136.342099343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-vbg6j" (UID: "871c529e-b263-4600-982a-d7be266f86e4") : failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.579989 4893 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.580164 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.080143216 +0000 UTC m=+136.342320148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.580713 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581653 4893 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581790 4893 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581867 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca podName:ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.081830037 +0000 UTC m=+136.344006839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca") pod "console-f9d7485db-psm2j" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581698 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581935 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.081898389 +0000 UTC m=+136.344075191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581723 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.581968 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:57.081961081 +0000 UTC m=+136.344137883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: E0314 07:00:56.582283 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images podName:871c529e-b263-4600-982a-d7be266f86e4 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:57.082258048 +0000 UTC m=+136.344434890 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images") pod "machine-api-operator-5694c8668f-vbg6j" (UID: "871c529e-b263-4600-982a-d7be266f86e4") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.600854 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.619189 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.639929 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.658909 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.679627 4893 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.701161 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.720357 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.740129 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.759064 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.779699 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.799175 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.819763 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.839477 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.859655 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.891653 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.900829 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.920776 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.939672 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.959781 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 07:00:56 crc kubenswrapper[4893]: I0314 07:00:56.979734 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.000795 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.020109 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.039463 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.060764 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.080043 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.096331 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.096698 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.096942 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.097286 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.097580 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.097801 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.098053 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.098343 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.098622 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.098847 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.099064 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.099310 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.100460 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.119187 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.139195 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.159505 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.179463 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.200294 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.220658 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-sysctl-allowlist" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.240478 4893 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.259139 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.281432 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.330017 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbpjl\" (UniqueName: \"kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl\") pod \"controller-manager-879f6c89f-x2sb4\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.337868 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb8bt\" (UniqueName: \"kubernetes.io/projected/39b2c526-e373-404b-a607-bdd8d29c8fae-kube-api-access-sb8bt\") pod \"apiserver-7bbb656c7d-lmn9t\" (UID: \"39b2c526-e373-404b-a607-bdd8d29c8fae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.359808 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807a844d-47c0-4dc3-b820-0ea0069a28b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zzldb\" (UID: \"807a844d-47c0-4dc3-b820-0ea0069a28b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.366318 4893 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.397593 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d258\" (UniqueName: \"kubernetes.io/projected/8cacf151-878c-4b98-accc-7731c17b2de8-kube-api-access-8d258\") pod \"olm-operator-6b444d44fb-h2xwg\" (UID: \"8cacf151-878c-4b98-accc-7731c17b2de8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.421712 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mk4m\" (UniqueName: \"kubernetes.io/projected/44475c5a-f261-48e3-be03-2a5b4f127e5a-kube-api-access-7mk4m\") pod \"migrator-59844c95c7-52b8f\" (UID: \"44475c5a-f261-48e3-be03-2a5b4f127e5a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.428325 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.436825 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e25c5b47-b7b9-4213-931b-5d6691aaa2d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jj2bg\" (UID: \"e25c5b47-b7b9-4213-931b-5d6691aaa2d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.444771 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.459992 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rg6g\" (UniqueName: \"kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.477067 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.490326 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqqml\" (UniqueName: \"kubernetes.io/projected/bdb8df65-c9b0-488a-9653-c0e97024027b-kube-api-access-hqqml\") pod \"openshift-config-operator-7777fb866f-wdnt5\" (UID: \"bdb8df65-c9b0-488a-9653-c0e97024027b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.502675 4893 request.go:700] Waited for 1.922043173s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.526736 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7pcb\" (UniqueName: \"kubernetes.io/projected/7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b-kube-api-access-s7pcb\") pod \"multus-admission-controller-857f4d67dd-k4jvl\" (UID: \"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.542022 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.544239 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxkd\" (UniqueName: \"kubernetes.io/projected/55b95986-0c2a-4589-98cf-e3834cf5982d-kube-api-access-mdxkd\") pod \"openshift-apiserver-operator-796bbdcf4f-66mm8\" (UID: \"55b95986-0c2a-4589-98cf-e3834cf5982d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.560692 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.585845 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.586207 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t"] Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.601836 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.618784 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.626457 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.642295 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.659536 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.659730 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.667749 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb"] Mar 14 07:00:57 crc kubenswrapper[4893]: W0314 07:00:57.688406 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807a844d_47c0_4dc3_b820_0ea0069a28b6.slice/crio-006a4fcf98451c3f8b59e3d26a4e2fefcd4c166af2e0fe145bbf1f62edb23dc2 WatchSource:0}: Error finding container 006a4fcf98451c3f8b59e3d26a4e2fefcd4c166af2e0fe145bbf1f62edb23dc2: Status 404 returned error can't find the container with id 006a4fcf98451c3f8b59e3d26a4e2fefcd4c166af2e0fe145bbf1f62edb23dc2 Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707767 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42dc5935-aed6-4fee-a749-3c292d042df5-serving-cert\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707826 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc155a4b-53ea-4394-87d2-d4f966e3589d-serving-cert\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707852 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707870 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707895 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707909 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707939 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707969 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z28pl\" (UniqueName: \"kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.707986 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-default-certificate\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.708002 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef0e0ce2-d109-426d-8d69-7eb458708189-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.708017 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-config\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 
07:00:57.708033 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.708063 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b511756d-571f-4e3f-9fde-bd9cd2d6e038-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.708080 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.708924 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-auth-proxy-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709150 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5b706875-4635-4744-941b-f235ebd548b0-metrics-tls\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709184 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b82b8dee-58d7-4701-ad47-8ddc20898935-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709244 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709283 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b511756d-571f-4e3f-9fde-bd9cd2d6e038-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709302 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-config\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709393 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qq9k\" (UniqueName: \"kubernetes.io/projected/b1b92f23-a052-41c6-817f-d43b04079105-kube-api-access-7qq9k\") pod \"downloads-7954f5f757-vkgvv\" (UID: \"b1b92f23-a052-41c6-817f-d43b04079105\") " pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709430 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-stats-auth\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709452 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709467 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6kb\" (UniqueName: \"kubernetes.io/projected/1c71efc6-2e03-405c-84f9-6ba44b085df4-kube-api-access-ls6kb\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709488 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709514 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-metrics-certs\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709552 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pmc\" (UniqueName: \"kubernetes.io/projected/ef0e0ce2-d109-426d-8d69-7eb458708189-kube-api-access-v2pmc\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709579 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709600 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: 
\"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.709674 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: E0314 07:00:57.709966 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.20993987 +0000 UTC m=+137.472116673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710218 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pcz\" (UniqueName: \"kubernetes.io/projected/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-kube-api-access-t2pcz\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710278 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710318 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710334 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcg52\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710361 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzrnd\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-kube-api-access-dzrnd\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710409 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-machine-approver-tls\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710440 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksktz\" (UniqueName: \"kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710466 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710495 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710564 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710590 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqm9\" (UniqueName: \"kubernetes.io/projected/b511756d-571f-4e3f-9fde-bd9cd2d6e038-kube-api-access-6kqm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710614 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.710637 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c22j4\" (UniqueName: \"kubernetes.io/projected/bc155a4b-53ea-4394-87d2-d4f966e3589d-kube-api-access-c22j4\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.721746 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-trusted-ca\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.721793 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca\") pod 
\"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.721820 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.721844 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b82b8dee-58d7-4701-ad47-8ddc20898935-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722245 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722315 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7kjs\" (UniqueName: \"kubernetes.io/projected/5b706875-4635-4744-941b-f235ebd548b0-kube-api-access-d7kjs\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722407 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722432 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722450 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c71efc6-2e03-405c-84f9-6ba44b085df4-service-ca-bundle\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722502 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722552 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722790 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.722813 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgcs8\" (UniqueName: \"kubernetes.io/projected/42dc5935-aed6-4fee-a749-3c292d042df5-kube-api-access-rgcs8\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.726700 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.735394 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.739292 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.759787 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.763512 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.780782 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.782328 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.786679 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjrq\" (UniqueName: \"kubernetes.io/projected/871c529e-b263-4600-982a-d7be266f86e4-kube-api-access-7pjrq\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.799042 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.819100 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"]
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.820244 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.827328 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-config\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.827842 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.827995 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: E0314 07:00:57.828014 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.3279891 +0000 UTC m=+137.590165892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828044 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828106 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-mountpoint-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828127 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-csi-data-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828144 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-certs\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828861 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828877 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-444zm\" (UniqueName: \"kubernetes.io/projected/be7063fc-de10-4911-8c87-c3251c274912-kube-api-access-444zm\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.828924 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a772f0d-1184-43e0-9d33-b2d024933bd9-config\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829014 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829233 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pcz\" (UniqueName: \"kubernetes.io/projected/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-kube-api-access-t2pcz\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829303 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-plugins-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829330 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9t8d\" (UniqueName: \"kubernetes.io/projected/e026a9f7-3301-4162-9710-706837e6d0ea-kube-api-access-x9t8d\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829381 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829435 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcg52\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829468 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzrnd\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-kube-api-access-dzrnd\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829496 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktl25\" (UniqueName: \"kubernetes.io/projected/b9902b25-208d-48e9-bd97-f7e46797d813-kube-api-access-ktl25\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829539 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-machine-approver-tls\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829564 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksktz\" (UniqueName: \"kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829594 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829611 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829641 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqm9\" (UniqueName: \"kubernetes.io/projected/b511756d-571f-4e3f-9fde-bd9cd2d6e038-kube-api-access-6kqm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829661 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829681 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m998t\" (UniqueName: \"kubernetes.io/projected/26699cc4-344e-4154-baa7-8964daba84f7-kube-api-access-m998t\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829712 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b82b8dee-58d7-4701-ad47-8ddc20898935-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829734 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829755 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-socket-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829776 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829795 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5g9\" (UniqueName: \"kubernetes.io/projected/7f7aced4-1923-447e-a85d-b84ff4974986-kube-api-access-bt5g9\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829814 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggpb\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-kube-api-access-lggpb\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829831 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-webhook-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829865 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829882 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c71efc6-2e03-405c-84f9-6ba44b085df4-service-ca-bundle\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829901 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2a95849-5cd5-46db-9979-7ed894406e61-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829930 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-service-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829947 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzq5j\" (UniqueName: \"kubernetes.io/projected/f5e5a700-4ef2-4e79-9eaf-b1291e395185-kube-api-access-zzq5j\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829975 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-profile-collector-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.829991 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830029 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42dc5935-aed6-4fee-a749-3c292d042df5-serving-cert\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830058 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a772f0d-1184-43e0-9d33-b2d024933bd9-serving-cert\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830084 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr26g\" (UniqueName: \"kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830110 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830132 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830152 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830171 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830187 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnbqz\" (UniqueName: \"kubernetes.io/projected/122146eb-6174-4393-aabf-121f89674223-kube-api-access-gnbqz\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830226 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjqf\" (UniqueName: \"kubernetes.io/projected/33e432c4-ff58-4a8d-98b0-28668b3d7a88-kube-api-access-ctjqf\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830243 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-config\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830260 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-registration-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830277 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830287 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830296 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b706875-4635-4744-941b-f235ebd548b0-metrics-tls\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830369 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b82b8dee-58d7-4701-ad47-8ddc20898935-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830389 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830432 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e5a700-4ef2-4e79-9eaf-b1291e395185-config-volume\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830454 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgd5\" (UniqueName: \"kubernetes.io/projected/72b53c84-bbb9-4fd1-9acd-625ecf005bff-kube-api-access-6xgd5\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.830502 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b511756d-571f-4e3f-9fde-bd9cd2d6e038-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831043 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-images\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831068 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f23fcc26-f800-497a-b038-065687659df7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831157 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-client\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831248 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6kb\" (UniqueName: \"kubernetes.io/projected/1c71efc6-2e03-405c-84f9-6ba44b085df4-kube-api-access-ls6kb\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831288 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqmf\" (UniqueName: \"kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831294 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831313 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-metrics-certs\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831357 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-serving-cert\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831380 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj8rq\" (UniqueName: \"kubernetes.io/projected/41364595-3dec-491e-823b-5cd2d7c4ea46-kube-api-access-mj8rq\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831446 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831531 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-node-bootstrap-token\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831556 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831598 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a74016e-259d-4d37-b12b-1a1255430d59-metrics-tls\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831623 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41364595-3dec-491e-823b-5cd2d7c4ea46-proxy-tls\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831643 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a95849-5cd5-46db-9979-7ed894406e61-config\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831679 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b9902b25-208d-48e9-bd97-f7e46797d813-signing-key\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831703 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831726 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a74016e-259d-4d37-b12b-1a1255430d59-trusted-ca\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.831978 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-config\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832041 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b53c84-bbb9-4fd1-9acd-625ecf005bff-cert\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832076 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhx6\" (UniqueName: \"kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832123 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-srv-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832160 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dkb\" (UniqueName: \"kubernetes.io/projected/38206b8e-cb94-47b8-b857-63a073f99ef7-kube-api-access-z2dkb\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832188 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832216 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832245 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832285 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832311 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c22j4\" (UniqueName: \"kubernetes.io/projected/bc155a4b-53ea-4394-87d2-d4f966e3589d-kube-api-access-c22j4\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832335 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"
Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832361 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92xt8\" (UniqueName: \"kubernetes.io/projected/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-kube-api-access-92xt8\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832407 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-trusted-ca\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832434 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832463 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832508 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7kjs\" (UniqueName: \"kubernetes.io/projected/5b706875-4635-4744-941b-f235ebd548b0-kube-api-access-d7kjs\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832644 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a95849-5cd5-46db-9979-7ed894406e61-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832666 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832685 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-config\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832703 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e5a700-4ef2-4e79-9eaf-b1291e395185-metrics-tls\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832737 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 
07:00:57.832755 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832784 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832803 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgcs8\" (UniqueName: \"kubernetes.io/projected/42dc5935-aed6-4fee-a749-3c292d042df5-kube-api-access-rgcs8\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832825 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6j6\" (UniqueName: \"kubernetes.io/projected/f23fcc26-f800-497a-b038-065687659df7-kube-api-access-4l6j6\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832850 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832871 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc155a4b-53ea-4394-87d2-d4f966e3589d-serving-cert\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832888 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832908 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4glh\" (UniqueName: \"kubernetes.io/projected/4a772f0d-1184-43e0-9d33-b2d024933bd9-kube-api-access-w4glh\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.832925 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: 
I0314 07:00:57.833000 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z28pl\" (UniqueName: \"kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833019 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-default-certificate\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833039 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef0e0ce2-d109-426d-8d69-7eb458708189-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833059 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122146eb-6174-4393-aabf-121f89674223-proxy-tls\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833078 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/122146eb-6174-4393-aabf-121f89674223-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833096 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/be7063fc-de10-4911-8c87-c3251c274912-tmpfs\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833263 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833316 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-auth-proxy-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833335 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b511756d-571f-4e3f-9fde-bd9cd2d6e038-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833365 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b9902b25-208d-48e9-bd97-f7e46797d813-signing-cabundle\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833386 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-config\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833404 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qq9k\" (UniqueName: \"kubernetes.io/projected/b1b92f23-a052-41c6-817f-d43b04079105-kube-api-access-7qq9k\") pod \"downloads-7954f5f757-vkgvv\" (UID: \"b1b92f23-a052-41c6-817f-d43b04079105\") " pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833440 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/26699cc4-344e-4154-baa7-8964daba84f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833458 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-stats-auth\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833478 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833495 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.833512 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-apiservice-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc 
kubenswrapper[4893]: I0314 07:00:57.833588 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pmc\" (UniqueName: \"kubernetes.io/projected/ef0e0ce2-d109-426d-8d69-7eb458708189-kube-api-access-v2pmc\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.834259 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c71efc6-2e03-405c-84f9-6ba44b085df4-service-ca-bundle\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.834462 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.834560 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-metrics-certs\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: E0314 07:00:57.835378 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:58.3353604 +0000 UTC m=+137.597537192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.835441 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.835551 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b706875-4635-4744-941b-f235ebd548b0-metrics-tls\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.835769 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.836414 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-config\") pod 
\"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.837013 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.837082 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dc5935-aed6-4fee-a749-3c292d042df5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.837411 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-auth-proxy-config\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.837464 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-machine-approver-tls\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.838571 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b511756d-571f-4e3f-9fde-bd9cd2d6e038-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.839300 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b511756d-571f-4e3f-9fde-bd9cd2d6e038-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.839679 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.839765 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b82b8dee-58d7-4701-ad47-8ddc20898935-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.840171 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.840493 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc155a4b-53ea-4394-87d2-d4f966e3589d-serving-cert\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.840878 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.841543 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-default-certificate\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.842956 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.848187 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 
crc kubenswrapper[4893]: I0314 07:00:57.848258 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c71efc6-2e03-405c-84f9-6ba44b085df4-stats-auth\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.848302 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef0e0ce2-d109-426d-8d69-7eb458708189-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.848706 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-encryption-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.848943 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42dc5935-aed6-4fee-a749-3c292d042df5-serving-cert\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.860971 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.881877 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5"] Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.886170 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.896023 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc155a4b-53ea-4394-87d2-d4f966e3589d-trusted-ca\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.898940 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.915874 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.916510 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.920822 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.934980 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935168 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-444zm\" (UniqueName: \"kubernetes.io/projected/be7063fc-de10-4911-8c87-c3251c274912-kube-api-access-444zm\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935195 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a772f0d-1184-43e0-9d33-b2d024933bd9-config\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935214 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935256 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-plugins-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935273 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9t8d\" (UniqueName: \"kubernetes.io/projected/e026a9f7-3301-4162-9710-706837e6d0ea-kube-api-access-x9t8d\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935306 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktl25\" (UniqueName: \"kubernetes.io/projected/b9902b25-208d-48e9-bd97-f7e46797d813-kube-api-access-ktl25\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935492 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg"] Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935549 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935632 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935672 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m998t\" (UniqueName: 
\"kubernetes.io/projected/26699cc4-344e-4154-baa7-8964daba84f7-kube-api-access-m998t\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935703 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935765 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-socket-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935795 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5g9\" (UniqueName: \"kubernetes.io/projected/7f7aced4-1923-447e-a85d-b84ff4974986-kube-api-access-bt5g9\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935824 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggpb\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-kube-api-access-lggpb\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 
07:00:57.935846 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-webhook-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935877 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2a95849-5cd5-46db-9979-7ed894406e61-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935904 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-service-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935930 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzq5j\" (UniqueName: \"kubernetes.io/projected/f5e5a700-4ef2-4e79-9eaf-b1291e395185-kube-api-access-zzq5j\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935953 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-profile-collector-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.935974 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936011 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr26g\" (UniqueName: \"kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936033 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a772f0d-1184-43e0-9d33-b2d024933bd9-serving-cert\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936057 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936081 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnbqz\" (UniqueName: \"kubernetes.io/projected/122146eb-6174-4393-aabf-121f89674223-kube-api-access-gnbqz\") pod 
\"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936119 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjqf\" (UniqueName: \"kubernetes.io/projected/33e432c4-ff58-4a8d-98b0-28668b3d7a88-kube-api-access-ctjqf\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936145 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-registration-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936195 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgd5\" (UniqueName: \"kubernetes.io/projected/72b53c84-bbb9-4fd1-9acd-625ecf005bff-kube-api-access-6xgd5\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936220 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e5a700-4ef2-4e79-9eaf-b1291e395185-config-volume\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936240 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-images\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936262 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936264 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f23fcc26-f800-497a-b038-065687659df7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936318 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-client\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936338 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqmf\" (UniqueName: \"kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936365 
4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-serving-cert\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936381 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj8rq\" (UniqueName: \"kubernetes.io/projected/41364595-3dec-491e-823b-5cd2d7c4ea46-kube-api-access-mj8rq\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936395 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936463 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: E0314 07:00:57.936552 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:00:58.436534739 +0000 UTC m=+137.698711531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936642 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-plugins-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936983 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937257 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a772f0d-1184-43e0-9d33-b2d024933bd9-config\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.936412 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir\") pod 
\"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937289 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937356 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-node-bootstrap-token\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937374 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41364595-3dec-491e-823b-5cd2d7c4ea46-proxy-tls\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937405 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a95849-5cd5-46db-9979-7ed894406e61-config\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937427 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/b9902b25-208d-48e9-bd97-f7e46797d813-signing-key\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937461 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a74016e-259d-4d37-b12b-1a1255430d59-metrics-tls\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937482 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b53c84-bbb9-4fd1-9acd-625ecf005bff-cert\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937504 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhx6\" (UniqueName: \"kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937511 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-registration-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937554 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a74016e-259d-4d37-b12b-1a1255430d59-trusted-ca\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937665 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dkb\" (UniqueName: \"kubernetes.io/projected/38206b8e-cb94-47b8-b857-63a073f99ef7-kube-api-access-z2dkb\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937699 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-srv-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937718 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937750 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92xt8\" (UniqueName: \"kubernetes.io/projected/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-kube-api-access-92xt8\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 
07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937781 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937822 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a95849-5cd5-46db-9979-7ed894406e61-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937846 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-config\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937871 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e5a700-4ef2-4e79-9eaf-b1291e395185-metrics-tls\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937943 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6j6\" (UniqueName: \"kubernetes.io/projected/f23fcc26-f800-497a-b038-065687659df7-kube-api-access-4l6j6\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.937984 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4glh\" (UniqueName: \"kubernetes.io/projected/4a772f0d-1184-43e0-9d33-b2d024933bd9-kube-api-access-w4glh\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938009 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938045 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122146eb-6174-4393-aabf-121f89674223-proxy-tls\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938062 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/122146eb-6174-4393-aabf-121f89674223-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938090 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/be7063fc-de10-4911-8c87-c3251c274912-tmpfs\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938112 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938145 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b9902b25-208d-48e9-bd97-f7e46797d813-signing-cabundle\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938174 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/26699cc4-344e-4154-baa7-8964daba84f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938198 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-apiservice-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 
07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938237 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-mountpoint-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938251 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-csi-data-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938267 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-certs\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938401 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.938847 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a74016e-259d-4d37-b12b-1a1255430d59-trusted-ca\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc 
kubenswrapper[4893]: I0314 07:00:57.939907 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a95849-5cd5-46db-9979-7ed894406e61-config\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.939919 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-client\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.940028 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e5a700-4ef2-4e79-9eaf-b1291e395185-config-volume\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.942509 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-config\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.942569 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-socket-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.942763 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41364595-3dec-491e-823b-5cd2d7c4ea46-images\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.943162 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.943411 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.943459 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-mountpoint-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.943563 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38206b8e-cb94-47b8-b857-63a073f99ef7-csi-data-dir\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.944855 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f7aced4-1923-447e-a85d-b84ff4974986-etcd-service-ca\") pod \"etcd-operator-b45778765-2vgnl\" (UID: 
\"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.945031 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.945665 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b9902b25-208d-48e9-bd97-f7e46797d813-signing-cabundle\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.947543 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.948346 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f7aced4-1923-447e-a85d-b84ff4974986-serving-cert\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.949132 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-certs\") pod \"machine-config-server-2p482\" (UID: 
\"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.949738 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/26699cc4-344e-4154-baa7-8964daba84f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.950190 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/122146eb-6174-4393-aabf-121f89674223-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.950350 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.951255 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-apiservice-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.951887 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a772f0d-1184-43e0-9d33-b2d024933bd9-serving-cert\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.951927 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a95849-5cd5-46db-9979-7ed894406e61-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.951980 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952042 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-profile-collector-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952108 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be7063fc-de10-4911-8c87-c3251c274912-webhook-cert\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 
07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952346 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33e432c4-ff58-4a8d-98b0-28668b3d7a88-srv-cert\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b53c84-bbb9-4fd1-9acd-625ecf005bff-cert\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952443 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952430 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122146eb-6174-4393-aabf-121f89674223-proxy-tls\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952578 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e026a9f7-3301-4162-9710-706837e6d0ea-node-bootstrap-token\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 
14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.952762 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f"] Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.953086 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/be7063fc-de10-4911-8c87-c3251c274912-tmpfs\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.953085 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a74016e-259d-4d37-b12b-1a1255430d59-metrics-tls\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.953717 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f23fcc26-f800-497a-b038-065687659df7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.953732 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41364595-3dec-491e-823b-5cd2d7c4ea46-proxy-tls\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" Mar 14 07:00:57 crc kubenswrapper[4893]: W0314 07:00:57.953833 4893 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb8df65_c9b0_488a_9653_c0e97024027b.slice/crio-cac36c37b0d62e30aa24c813c47283f1bda2051069b69ddcf66d95555dda497d WatchSource:0}: Error finding container cac36c37b0d62e30aa24c813c47283f1bda2051069b69ddcf66d95555dda497d: Status 404 returned error can't find the container with id cac36c37b0d62e30aa24c813c47283f1bda2051069b69ddcf66d95555dda497d Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.959159 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.959290 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg"] Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.959685 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" event={"ID":"39b2c526-e373-404b-a607-bdd8d29c8fae","Type":"ContainerStarted","Data":"962122fce0a0df427fdb621234a1a6ae7683dd175d6c02fa010e135237a8f936"} Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.959720 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b9902b25-208d-48e9-bd97-f7e46797d813-signing-key\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.961443 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" event={"ID":"807a844d-47c0-4dc3-b820-0ea0069a28b6","Type":"ContainerStarted","Data":"006a4fcf98451c3f8b59e3d26a4e2fefcd4c166af2e0fe145bbf1f62edb23dc2"} Mar 14 07:00:57 crc kubenswrapper[4893]: W0314 07:00:57.962435 4893 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25c5b47_b7b9_4213_931b_5d6691aaa2d4.slice/crio-ab2330108db0953931931557d536401e531969140b2897f2fa845daa1463b5ef WatchSource:0}: Error finding container ab2330108db0953931931557d536401e531969140b2897f2fa845daa1463b5ef: Status 404 returned error can't find the container with id ab2330108db0953931931557d536401e531969140b2897f2fa845daa1463b5ef Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.963828 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" event={"ID":"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3","Type":"ContainerStarted","Data":"fd9c4c7203ed35aba1bf7c835c91534ce3d24daa0a7215c46b3d277e32b19c4a"} Mar 14 07:00:57 crc kubenswrapper[4893]: W0314 07:00:57.965810 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44475c5a_f261_48e3_be03_2a5b4f127e5a.slice/crio-2f556be3541d9cd11a56501a47352d97cb2db0a7f8365d1d3c82ecfb38a1a875 WatchSource:0}: Error finding container 2f556be3541d9cd11a56501a47352d97cb2db0a7f8365d1d3c82ecfb38a1a875: Status 404 returned error can't find the container with id 2f556be3541d9cd11a56501a47352d97cb2db0a7f8365d1d3c82ecfb38a1a875 Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.970094 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e5a700-4ef2-4e79-9eaf-b1291e395185-metrics-tls\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.980262 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.995931 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-serving-cert\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:57 crc kubenswrapper[4893]: I0314 07:00:57.999491 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.008552 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/871c529e-b263-4600-982a-d7be266f86e4-images\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.021576 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.031993 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4jvl"] Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.032331 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtp6t\" (UniqueName: \"kubernetes.io/projected/ef07db86-4677-41ce-8b6d-7960cc63a9b8-kube-api-access-gtp6t\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.039290 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.039365 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.039858 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.53984051 +0000 UTC m=+137.802017302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.059096 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.070623 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-client\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.074710 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8"] Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.078932 4893 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.094946 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: W0314 07:00:58.095770 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b95986_0c2a_4589_98cf_e3834cf5982d.slice/crio-52ec1fc1c2749bb642682f8697449a43eebc1fd3e05352404a21af0838148aea WatchSource:0}: Error finding container 52ec1fc1c2749bb642682f8697449a43eebc1fd3e05352404a21af0838148aea: Status 404 returned error can't find the container with id 52ec1fc1c2749bb642682f8697449a43eebc1fd3e05352404a21af0838148aea Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104744 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104812 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104848 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.10482844 +0000 UTC m=+138.367005232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104889 4893 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104926 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.104899372 +0000 UTC m=+138.367076164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.104954 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls podName:871c529e-b263-4600-982a-d7be266f86e4 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.104943423 +0000 UTC m=+138.367120215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-vbg6j" (UID: "871c529e-b263-4600-982a-d7be266f86e4") : failed to sync secret cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105059 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105069 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105119 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.105099976 +0000 UTC m=+138.367276768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105140 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.105130727 +0000 UTC m=+138.367307519 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105504 4893 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.105649 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config podName:ef07db86-4677-41ce-8b6d-7960cc63a9b8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.10562798 +0000 UTC m=+138.367804772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config") pod "apiserver-76f77b778f-btl2s" (UID: "ef07db86-4677-41ce-8b6d-7960cc63a9b8") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.107167 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.108541 4893 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.108629 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca podName:ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.108611802 +0000 UTC m=+138.370788594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca") pod "console-f9d7485db-psm2j" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9") : failed to sync configmap cache: timed out waiting for the condition Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.122243 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.140311 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.140890 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.141027 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.641005509 +0000 UTC m=+137.903182311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.141092 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.141800 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.641781729 +0000 UTC m=+137.903958611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.158979 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.171921 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.179691 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.207633 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.219482 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.239384 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.242148 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.242283 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.74225616 +0000 UTC m=+138.004432952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.242966 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.243326 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.743312746 +0000 UTC m=+138.005489548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.244746 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.274597 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.281004 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.282349 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.291605 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls\") pod 
\"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.298789 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.319578 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.340759 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.341156 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.343601 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.343721 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.843703127 +0000 UTC m=+138.105879919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.344059 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.344353 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.844343652 +0000 UTC m=+138.106520444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.359217 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.369533 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.386494 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.396061 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.399605 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.418990 4893 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.439275 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.445734 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.445880 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.945862841 +0000 UTC m=+138.208039633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.446461 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.446824 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:58.946814633 +0000 UTC m=+138.208991425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.447019 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.460062 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.479924 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.506211 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.508853 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.512489 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b82b8dee-58d7-4701-ad47-8ddc20898935-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.518204 4893 request.go:700] Waited for 1.140576775s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.528156 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.547452 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.547623 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.047600453 +0000 UTC m=+138.309777245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.547989 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.548323 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.048315781 +0000 UTC m=+138.310492573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.572411 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.601677 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2pcz\" (UniqueName: \"kubernetes.io/projected/2336c0f6-9ad9-45d3-bb00-cc5832466e7c-kube-api-access-t2pcz\") pod \"machine-approver-56656f9798-7wtvk\" (UID: \"2336c0f6-9ad9-45d3-bb00-cc5832466e7c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.611752 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcg52\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.631545 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksktz\" (UniqueName: \"kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz\") pod 
\"oauth-openshift-558db77b4-sfrgk\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.650883 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.651132 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.151099829 +0000 UTC m=+138.413276621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.651270 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.651880 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.151868588 +0000 UTC m=+138.414045380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.653758 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzrnd\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-kube-api-access-dzrnd\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.693045 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqm9\" (UniqueName: \"kubernetes.io/projected/b511756d-571f-4e3f-9fde-bd9cd2d6e038-kube-api-access-6kqm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-jn9rp\" (UID: \"b511756d-571f-4e3f-9fde-bd9cd2d6e038\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.695451 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b82b8dee-58d7-4701-ad47-8ddc20898935-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvvds\" (UID: \"b82b8dee-58d7-4701-ad47-8ddc20898935\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.716724 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7kjs\" (UniqueName: \"kubernetes.io/projected/5b706875-4635-4744-941b-f235ebd548b0-kube-api-access-d7kjs\") pod \"dns-operator-744455d44c-gnrsd\" (UID: \"5b706875-4635-4744-941b-f235ebd548b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.733360 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pmc\" (UniqueName: \"kubernetes.io/projected/ef0e0ce2-d109-426d-8d69-7eb458708189-kube-api-access-v2pmc\") pod \"cluster-samples-operator-665b6dd947-qq2pq\" (UID: \"ef0e0ce2-d109-426d-8d69-7eb458708189\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.753062 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.754440 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.254405001 +0000 UTC m=+138.516581793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.755652 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6kb\" (UniqueName: \"kubernetes.io/projected/1c71efc6-2e03-405c-84f9-6ba44b085df4-kube-api-access-ls6kb\") pod \"router-default-5444994796-ss7r8\" (UID: \"1c71efc6-2e03-405c-84f9-6ba44b085df4\") " pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.775388 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.776078 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qq9k\" (UniqueName: \"kubernetes.io/projected/b1b92f23-a052-41c6-817f-d43b04079105-kube-api-access-7qq9k\") pod \"downloads-7954f5f757-vkgvv\" (UID: \"b1b92f23-a052-41c6-817f-d43b04079105\") " pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.798738 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c22j4\" (UniqueName: \"kubernetes.io/projected/bc155a4b-53ea-4394-87d2-d4f966e3589d-kube-api-access-c22j4\") pod \"console-operator-58897d9998-cgrf8\" (UID: \"bc155a4b-53ea-4394-87d2-d4f966e3589d\") " pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.815595 4893 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rgcs8\" (UniqueName: \"kubernetes.io/projected/42dc5935-aed6-4fee-a749-3c292d042df5-kube-api-access-rgcs8\") pod \"authentication-operator-69f744f599-qgm7k\" (UID: \"42dc5935-aed6-4fee-a749-3c292d042df5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.834389 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z28pl\" (UniqueName: \"kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl\") pod \"route-controller-manager-6576b87f9c-m8wwx\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.841135 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.851670 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.853159 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9t8d\" (UniqueName: \"kubernetes.io/projected/e026a9f7-3301-4162-9710-706837e6d0ea-kube-api-access-x9t8d\") pod \"machine-config-server-2p482\" (UID: \"e026a9f7-3301-4162-9710-706837e6d0ea\") " pod="openshift-machine-config-operator/machine-config-server-2p482" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.856039 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.856357 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.356345839 +0000 UTC m=+138.618522631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.875480 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.882781 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.882795 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktl25\" (UniqueName: \"kubernetes.io/projected/b9902b25-208d-48e9-bd97-f7e46797d813-kube-api-access-ktl25\") pod \"service-ca-9c57cc56f-xmj8x\" (UID: \"b9902b25-208d-48e9-bd97-f7e46797d813\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.894675 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-444zm\" (UniqueName: \"kubernetes.io/projected/be7063fc-de10-4911-8c87-c3251c274912-kube-api-access-444zm\") pod \"packageserver-d55dfcdfc-tsbfw\" (UID: \"be7063fc-de10-4911-8c87-c3251c274912\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.927041 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.955867 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m998t\" (UniqueName: \"kubernetes.io/projected/26699cc4-344e-4154-baa7-8964daba84f7-kube-api-access-m998t\") pod \"package-server-manager-789f6589d5-vcv9c\" (UID: \"26699cc4-344e-4154-baa7-8964daba84f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.956822 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:58 crc kubenswrapper[4893]: E0314 07:00:58.957174 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.457151669 +0000 UTC m=+138.719328491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.957927 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr26g\" (UniqueName: \"kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g\") pod \"marketplace-operator-79b997595-jn8vg\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg"
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.963766 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjqf\" (UniqueName: \"kubernetes.io/projected/33e432c4-ff58-4a8d-98b0-28668b3d7a88-kube-api-access-ctjqf\") pod \"catalog-operator-68c6474976-57kx4\" (UID: \"33e432c4-ff58-4a8d-98b0-28668b3d7a88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.972441 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ss7r8"
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.984278 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqmf\" (UniqueName: \"kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf\") pod \"cni-sysctl-allowlist-ds-6xtr4\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.986441 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.996623 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" event={"ID":"807a844d-47c0-4dc3-b820-0ea0069a28b6","Type":"ContainerStarted","Data":"c4a7b784f20366e1fa03580c00bb58b40b905d63714f38984d3622a3118c3455"}
Mar 14 07:00:58 crc kubenswrapper[4893]: I0314 07:00:58.998112 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj8rq\" (UniqueName: \"kubernetes.io/projected/41364595-3dec-491e-823b-5cd2d7c4ea46-kube-api-access-mj8rq\") pod \"machine-config-operator-74547568cd-7l845\" (UID: \"41364595-3dec-491e-823b-5cd2d7c4ea46\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:58.999401 4893 generic.go:334] "Generic (PLEG): container finished" podID="bdb8df65-c9b0-488a-9653-c0e97024027b" containerID="41bd756653a473f13adca5850d59a53448e0f82c95e1656f4868593aa2aa3021" exitCode=0
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:58.999816 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" event={"ID":"bdb8df65-c9b0-488a-9653-c0e97024027b","Type":"ContainerDied","Data":"41bd756653a473f13adca5850d59a53448e0f82c95e1656f4868593aa2aa3021"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:58.999842 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" event={"ID":"bdb8df65-c9b0-488a-9653-c0e97024027b","Type":"ContainerStarted","Data":"cac36c37b0d62e30aa24c813c47283f1bda2051069b69ddcf66d95555dda497d"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.012466 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.012813 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" event={"ID":"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b","Type":"ContainerStarted","Data":"2ab40673f5c61d5f0a126840e580737dbbe9da2c87ed366c6f5e23b921f470af"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.012856 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" event={"ID":"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b","Type":"ContainerStarted","Data":"663ce6e910cec33e5f5d2f617b8df3fd36748791023fed559eb05abfbf9fdb9f"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.016479 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" event={"ID":"e25c5b47-b7b9-4213-931b-5d6691aaa2d4","Type":"ContainerStarted","Data":"b1519a4e3b243165d4a4fda4f079c679fe64ec3afc2b6b078012fd86c6492078"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.016536 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" event={"ID":"e25c5b47-b7b9-4213-931b-5d6691aaa2d4","Type":"ContainerStarted","Data":"ab2330108db0953931931557d536401e531969140b2897f2fa845daa1463b5ef"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.016828 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgd5\" (UniqueName: \"kubernetes.io/projected/72b53c84-bbb9-4fd1-9acd-625ecf005bff-kube-api-access-6xgd5\") pod \"ingress-canary-jh6jw\" (UID: \"72b53c84-bbb9-4fd1-9acd-625ecf005bff\") " pod="openshift-ingress-canary/ingress-canary-jh6jw"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.036066 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4glh\" (UniqueName: \"kubernetes.io/projected/4a772f0d-1184-43e0-9d33-b2d024933bd9-kube-api-access-w4glh\") pod \"service-ca-operator-777779d784-wnwsg\" (UID: \"4a772f0d-1184-43e0-9d33-b2d024933bd9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.040988 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.046315 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sfrgk"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.047036 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.060890 4893 generic.go:334] "Generic (PLEG): container finished" podID="39b2c526-e373-404b-a607-bdd8d29c8fae" containerID="1a9d6fc56f84465608f3a9d8bcd2cac4246afdd0874e5d046369391afdc406d5" exitCode=0
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.060968 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" event={"ID":"39b2c526-e373-404b-a607-bdd8d29c8fae","Type":"ContainerDied","Data":"1a9d6fc56f84465608f3a9d8bcd2cac4246afdd0874e5d046369391afdc406d5"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.061415 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.063161 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.063480 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.065751 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.066098 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.566086688 +0000 UTC m=+138.828263480 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.071032 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" event={"ID":"2336c0f6-9ad9-45d3-bb00-cc5832466e7c","Type":"ContainerStarted","Data":"b9735a3dec461dbde416ad4b1cdb2e857d34e341456e3261bb487b442bbb64e3"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.075839 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.075863 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cgrf8"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.084903 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.090734 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" event={"ID":"44475c5a-f261-48e3-be03-2a5b4f127e5a","Type":"ContainerStarted","Data":"c5ee989f73c9b5fccbc0d706cac73beeea6bfc064519f875d0849d404b461ead"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.090781 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" event={"ID":"44475c5a-f261-48e3-be03-2a5b4f127e5a","Type":"ContainerStarted","Data":"8720c5453de8891229a449e724334fe84306966ebe3f8705b146c84f5dbfb476"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.090798 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" event={"ID":"44475c5a-f261-48e3-be03-2a5b4f127e5a","Type":"ContainerStarted","Data":"2f556be3541d9cd11a56501a47352d97cb2db0a7f8365d1d3c82ecfb38a1a875"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.091128 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.092752 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" event={"ID":"55b95986-0c2a-4589-98cf-e3834cf5982d","Type":"ContainerStarted","Data":"227243a5f9fffc7af2258ab8cc12e9f1eccf859a8ccf216fb997408f5aa26198"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.092787 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" event={"ID":"55b95986-0c2a-4589-98cf-e3834cf5982d","Type":"ContainerStarted","Data":"52ec1fc1c2749bb642682f8697449a43eebc1fd3e05352404a21af0838148aea"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.096222 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92xt8\" (UniqueName: \"kubernetes.io/projected/19d07c7b-828b-4582-aa4c-a6f4c1f09c9f-kube-api-access-92xt8\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4wwz\" (UID: \"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.099053 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" event={"ID":"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3","Type":"ContainerStarted","Data":"ee417ed0d4a034b0fa24a26bc52e41cd081acb674244d479d6e3cb1b213ae4ad"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.099767 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.100425 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.100429 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhx6\" (UniqueName: \"kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6\") pod \"collect-profiles-29557860-g42hf\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.103286 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" event={"ID":"8cacf151-878c-4b98-accc-7731c17b2de8","Type":"ContainerStarted","Data":"6fcebe486fc9b142b4de7f96d7213ebebb132279b5ada36039814dc0ff8aa6ae"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.103333 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" event={"ID":"8cacf151-878c-4b98-accc-7731c17b2de8","Type":"ContainerStarted","Data":"b37ae9cb6a0f6a118beaaea0e74c013c10c520a0522f50e2658fd487887bf669"}
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.103678 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.112328 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.117352 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2p482"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.120813 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.127089 4893 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x2sb4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.127192 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.127795 4893 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-h2xwg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.128000 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" podUID="8cacf151-878c-4b98-accc-7731c17b2de8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.128125 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dkb\" (UniqueName: \"kubernetes.io/projected/38206b8e-cb94-47b8-b857-63a073f99ef7-kube-api-access-z2dkb\") pod \"csi-hostpathplugin-phr5l\" (UID: \"38206b8e-cb94-47b8-b857-63a073f99ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.139572 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6j6\" (UniqueName: \"kubernetes.io/projected/f23fcc26-f800-497a-b038-065687659df7-kube-api-access-4l6j6\") pod \"control-plane-machine-set-operator-78cbb6b69f-v7lxq\" (UID: \"f23fcc26-f800-497a-b038-065687659df7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.147873 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-phr5l"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.153415 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh6jw"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.154610 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnbqz\" (UniqueName: \"kubernetes.io/projected/122146eb-6174-4393-aabf-121f89674223-kube-api-access-gnbqz\") pod \"machine-config-controller-84d6567774-spcr2\" (UID: \"122146eb-6174-4393-aabf-121f89674223\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167183 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167461 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167571 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167651 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167685 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167704 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167745 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.167854 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.168113 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.668097788 +0000 UTC m=+138.930274580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.171153 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.172474 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/871c529e-b263-4600-982a-d7be266f86e4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vbg6j\" (UID: \"871c529e-b263-4600-982a-d7be266f86e4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.174302 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"console-f9d7485db-psm2j\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " pod="openshift-console/console-f9d7485db-psm2j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.174339 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-config\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.174629 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-audit\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.176847 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-etcd-serving-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.184466 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5g9\" (UniqueName: \"kubernetes.io/projected/7f7aced4-1923-447e-a85d-b84ff4974986-kube-api-access-bt5g9\") pod \"etcd-operator-b45778765-2vgnl\" (UID: \"7f7aced4-1923-447e-a85d-b84ff4974986\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.186086 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds"]
Mar 14 07:00:59 crc kubenswrapper[4893]: W0314 07:00:59.189856 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928468fc_c237_4779_a2c6_7365b3764fe8.slice/crio-f35f87f1115fec26294aa10c852357897d09d5951e9fd108f37f1f38e705df5f WatchSource:0}: Error finding container f35f87f1115fec26294aa10c852357897d09d5951e9fd108f37f1f38e705df5f: Status 404 returned error can't find the container with id f35f87f1115fec26294aa10c852357897d09d5951e9fd108f37f1f38e705df5f
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.196110 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.200501 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggpb\" (UniqueName: \"kubernetes.io/projected/0a74016e-259d-4d37-b12b-1a1255430d59-kube-api-access-lggpb\") pod \"ingress-operator-5b745b69d9-ccjq4\" (UID: \"0a74016e-259d-4d37-b12b-1a1255430d59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.206966 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ef07db86-4677-41ce-8b6d-7960cc63a9b8-image-import-ca\") pod \"apiserver-76f77b778f-btl2s\" (UID: \"ef07db86-4677-41ce-8b6d-7960cc63a9b8\") " pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.222173 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzq5j\" (UniqueName: \"kubernetes.io/projected/f5e5a700-4ef2-4e79-9eaf-b1291e395185-kube-api-access-zzq5j\") pod \"dns-default-xxwxx\" (UID: \"f5e5a700-4ef2-4e79-9eaf-b1291e395185\") " pod="openshift-dns/dns-default-xxwxx"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.251617 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.266507 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2a95849-5cd5-46db-9979-7ed894406e61-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wt8zp\" (UID: \"c2a95849-5cd5-46db-9979-7ed894406e61\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.272221 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.273244 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.773232063 +0000 UTC m=+139.035408855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.283415 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gnrsd"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.294124 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.307249 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.313218 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.317568 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-psm2j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.319069 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.339896 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.363803 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qgm7k"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.369497 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.381552 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.382172 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.88215308 +0000 UTC m=+139.144329872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.383383 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-btl2s"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.417025 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.458493 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xxwxx"
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.489602 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.489982 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:00:59.989969711 +0000 UTC m=+139.252146503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.499586 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.501181 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vkgvv"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.561386 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmj8x"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.593115 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.593571 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.093540519 +0000 UTC m=+139.355717311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.680020 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.695734 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.696078 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.196062822 +0000 UTC m=+139.458239614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.714682 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"]
Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.796721 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.796995 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.296966205 +0000 UTC m=+139.559142997 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.797863 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.799865 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.299828944 +0000 UTC m=+139.562005736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.851204 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.899771 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:59 crc kubenswrapper[4893]: E0314 07:00:59.900181 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.400165353 +0000 UTC m=+139.662342145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:59 crc kubenswrapper[4893]: I0314 07:00:59.941492 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.004280 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.006132 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.506098428 +0000 UTC m=+139.768275310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.009830 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.009874 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.058827 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-phr5l"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.066354 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.073293 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cgrf8"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.108506 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.109373 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.609357838 +0000 UTC m=+139.871534630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.131078 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" event={"ID":"95fe2bd1-1c99-4105-931e-6a60dd881260","Type":"ContainerStarted","Data":"bb8ac88b07a14007465d064315f90614080167db4a0a536cb40f5bdd3942c16f"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.132079 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" event={"ID":"b9902b25-208d-48e9-bd97-f7e46797d813","Type":"ContainerStarted","Data":"4f4fa3d979e8a74e3709a2fa025d6f1dc77cef18e3ba1138b7e1724403794a30"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.133025 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vkgvv" event={"ID":"b1b92f23-a052-41c6-817f-d43b04079105","Type":"ContainerStarted","Data":"8c017d3fdd25383c71fe3b7f49055b64cd1f5c29a74d3b4406a0af2608485447"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.141684 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" event={"ID":"bdb8df65-c9b0-488a-9653-c0e97024027b","Type":"ContainerStarted","Data":"da3a99043f86b848b14d741d98897045b7e9bf887b3f95ce5d2a956cc5a67465"} Mar 14 07:01:00 crc 
kubenswrapper[4893]: I0314 07:01:00.141804 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.145279 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" event={"ID":"7bb36e2b-cd32-41b1-aa71-e9e6c8e8ce4b","Type":"ContainerStarted","Data":"afb9a55bc146e10d68f264e906ed51594019353cce9818ddf8f0b334b25074c4"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.149627 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ss7r8" event={"ID":"1c71efc6-2e03-405c-84f9-6ba44b085df4","Type":"ContainerStarted","Data":"3fa8e2c827172b944e3f05c2ae59b9a945f709ba83c48cd83b7e7209dcc8c438"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.149658 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ss7r8" event={"ID":"1c71efc6-2e03-405c-84f9-6ba44b085df4","Type":"ContainerStarted","Data":"8b43d8c55fa0ab17b4a461504a302a44f71471a0347b3fd44898ade2fa94b5b8"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.152457 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" event={"ID":"928468fc-c237-4779-a2c6-7365b3764fe8","Type":"ContainerStarted","Data":"f35f87f1115fec26294aa10c852357897d09d5951e9fd108f37f1f38e705df5f"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.172044 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" event={"ID":"2336c0f6-9ad9-45d3-bb00-cc5832466e7c","Type":"ContainerStarted","Data":"936b5015996159ac0f3398eb5fdbdd89a37bc154f6e65484ac7456c5c3931288"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.178422 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" event={"ID":"cf444907-0c34-43ab-9bbd-b9ef0773743c","Type":"ContainerStarted","Data":"65fff7269dcb3b043d060705f66ef7e6b97c8cbfbb1957f67070736938bf0e58"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.181779 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" event={"ID":"3009e07b-2452-425c-95c3-3a78fa993d62","Type":"ContainerStarted","Data":"7a0ae4aa4520ad5604e171adcc4ccc71f706656e975e7a676c3c34a478e0e9f2"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.185295 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" event={"ID":"42dc5935-aed6-4fee-a749-3c292d042df5","Type":"ContainerStarted","Data":"3d5639bd2e4bd8d1ee3d8f677234942c80fde7d2767442f3266eb8843714a9b0"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.190034 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2p482" event={"ID":"e026a9f7-3301-4162-9710-706837e6d0ea","Type":"ContainerStarted","Data":"44f113d5619a7c11de16c5e685479dfcdd68b9a32485819f5bf6ab7656705901"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.194628 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" event={"ID":"ef0e0ce2-d109-426d-8d69-7eb458708189","Type":"ContainerStarted","Data":"d428251382cf355db18a61f35ad8e307c16f335994bb3930a694e54eb96f1e6b"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.196368 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" event={"ID":"5b706875-4635-4744-941b-f235ebd548b0","Type":"ContainerStarted","Data":"8e37ebbc3013c7fba27a21bd91f0504dca62235326695619a63447f2317b0c5a"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.197694 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" event={"ID":"b82b8dee-58d7-4701-ad47-8ddc20898935","Type":"ContainerStarted","Data":"3b37a550b50356234b2a084edbdc6ca2c2ef4b5bac22b5162e1fcebb512cbce3"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.199578 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" event={"ID":"b511756d-571f-4e3f-9fde-bd9cd2d6e038","Type":"ContainerStarted","Data":"6b9e953d132585dd0f3547f466158ab2ee5aa0561b40294b4709e0fa0ae50e85"} Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.201564 4893 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x2sb4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.201622 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.211002 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.211375 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.711363638 +0000 UTC m=+139.973540420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.281922 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-66mm8" podStartSLOduration=77.281903553 podStartE2EDuration="1m17.281903553s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.236004717 +0000 UTC m=+139.498181519" watchObservedRunningTime="2026-03-14 07:01:00.281903553 +0000 UTC m=+139.544080345" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.283337 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" podStartSLOduration=77.283329717 podStartE2EDuration="1m17.283329717s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.28179728 +0000 UTC m=+139.543974072" watchObservedRunningTime="2026-03-14 07:01:00.283329717 +0000 UTC m=+139.545506509" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.308344 4893 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h2xwg" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.312048 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:00 crc kubenswrapper[4893]: W0314 07:01:00.319406 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a772f0d_1184_43e0_9d33_b2d024933bd9.slice/crio-b764ac4ad66772f538d596328867a8dd8ad54329c4ff1ba4672f0d4150fd07b6 WatchSource:0}: Error finding container b764ac4ad66772f538d596328867a8dd8ad54329c4ff1ba4672f0d4150fd07b6: Status 404 returned error can't find the container with id b764ac4ad66772f538d596328867a8dd8ad54329c4ff1ba4672f0d4150fd07b6 Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.319513 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.819494196 +0000 UTC m=+140.081670988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.414409 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.414704 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:00.914693621 +0000 UTC m=+140.176870413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.426264 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.458913 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" podStartSLOduration=77.458882524 podStartE2EDuration="1m17.458882524s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.442180839 +0000 UTC m=+139.704357651" watchObservedRunningTime="2026-03-14 07:01:00.458882524 +0000 UTC m=+139.721059326" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.460222 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l845"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.515315 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.515650 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.015629054 +0000 UTC m=+140.277805846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.617297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.617933 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.117921711 +0000 UTC m=+140.380098503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.640726 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-btl2s"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.701879 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.7018625109999999 podStartE2EDuration="1.701862511s" podCreationTimestamp="2026-03-14 07:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.700239672 +0000 UTC m=+139.962416474" watchObservedRunningTime="2026-03-14 07:01:00.701862511 +0000 UTC m=+139.964039303" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.703328 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp"] Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.718322 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.734220 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.234194257 +0000 UTC m=+140.496371039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.738191 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.738173604 podStartE2EDuration="2.738173604s" podCreationTimestamp="2026-03-14 07:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.734604267 +0000 UTC m=+139.996781059" watchObservedRunningTime="2026-03-14 07:01:00.738173604 +0000 UTC m=+140.000350396" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.765969 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jj2bg" podStartSLOduration=77.765138589 podStartE2EDuration="1m17.765138589s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.758875468 +0000 UTC m=+140.021052280" watchObservedRunningTime="2026-03-14 07:01:00.765138589 +0000 UTC m=+140.027315381" Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.812486 
4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2"]
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.828541 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.828790 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.328778287 +0000 UTC m=+140.590955079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:00 crc kubenswrapper[4893]: W0314 07:01:00.831695 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef07db86_4677_41ce_8b6d_7960cc63a9b8.slice/crio-02fc4b678f0893c8a6eea4cb8279b097ab0f8633c7fe5569ac93665c40a88ee3 WatchSource:0}: Error finding container 02fc4b678f0893c8a6eea4cb8279b097ab0f8633c7fe5569ac93665c40a88ee3: Status 404 returned error can't find the container with id 02fc4b678f0893c8a6eea4cb8279b097ab0f8633c7fe5569ac93665c40a88ee3
Mar 14 07:01:00 crc kubenswrapper[4893]: W0314 07:01:00.843647 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a95849_5cd5_46db_9979_7ed894406e61.slice/crio-d30165286db7f949f0eee2782d9cfe99d839e2962a59d68f2812008d28f27b9f WatchSource:0}: Error finding container d30165286db7f949f0eee2782d9cfe99d839e2962a59d68f2812008d28f27b9f: Status 404 returned error can't find the container with id d30165286db7f949f0eee2782d9cfe99d839e2962a59d68f2812008d28f27b9f
Mar 14 07:01:00 crc kubenswrapper[4893]: W0314 07:01:00.858568 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122146eb_6174_4393_aabf_121f89674223.slice/crio-1b10ad77e27f4e3b4f166ab7cce109a928fa5793445ccf5e8fed094931bcbc67 WatchSource:0}: Error finding container 1b10ad77e27f4e3b4f166ab7cce109a928fa5793445ccf5e8fed094931bcbc67: Status 404 returned error can't find the container with id 1b10ad77e27f4e3b4f166ab7cce109a928fa5793445ccf5e8fed094931bcbc67
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.913719 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zzldb" podStartSLOduration=77.913696531 podStartE2EDuration="1m17.913696531s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:00.905136043 +0000 UTC m=+140.167312855" watchObservedRunningTime="2026-03-14 07:01:00.913696531 +0000 UTC m=+140.175873343"
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.929511 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:01:00 crc kubenswrapper[4893]: E0314 07:01:00.930131 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.4301169 +0000 UTC m=+140.692293692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.980623 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ss7r8"
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.982000 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jh6jw"]
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.983990 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 14 07:01:00 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld
Mar 14 07:01:00 crc kubenswrapper[4893]: [+]process-running ok
Mar 14 07:01:00 crc kubenswrapper[4893]: healthz check failed
Mar 14 07:01:00 crc kubenswrapper[4893]: I0314 07:01:00.984036 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.006343 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz"]
Mar 14 07:01:01 crc kubenswrapper[4893]: W0314 07:01:01.026633 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b53c84_bbb9_4fd1_9acd_625ecf005bff.slice/crio-d7070f03be3d3f399920e5067f66008cc543c573c6c6beac928491e1d3a4dd37 WatchSource:0}: Error finding container d7070f03be3d3f399920e5067f66008cc543c573c6c6beac928491e1d3a4dd37: Status 404 returned error can't find the container with id d7070f03be3d3f399920e5067f66008cc543c573c6c6beac928491e1d3a4dd37
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.031589 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.031899 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.531888674 +0000 UTC m=+140.794065466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.127809 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-52b8f" podStartSLOduration=78.127794655 podStartE2EDuration="1m18.127794655s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.092276602 +0000 UTC m=+140.354453394" watchObservedRunningTime="2026-03-14 07:01:01.127794655 +0000 UTC m=+140.389971447"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.132491 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.132836 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.632821128 +0000 UTC m=+140.894997920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.156698 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2vgnl"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.164070 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.191000 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.209195 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-psm2j"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.217934 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" event={"ID":"41364595-3dec-491e-823b-5cd2d7c4ea46","Type":"ContainerStarted","Data":"123a7446bd22ea98a2d70f7f3fbb046b3c9ad9da629c45f5c144001aa46713e3"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.233506 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.233837 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.733825363 +0000 UTC m=+140.996002155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.234654 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" event={"ID":"b9902b25-208d-48e9-bd97-f7e46797d813","Type":"ContainerStarted","Data":"4e1d13d16b27a4c43ecba006b8fbc92e30e08d19baecc4acf7b4e4171edb76b1"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.236076 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" event={"ID":"4a772f0d-1184-43e0-9d33-b2d024933bd9","Type":"ContainerStarted","Data":"b764ac4ad66772f538d596328867a8dd8ad54329c4ff1ba4672f0d4150fd07b6"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.240171 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xxwxx"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.246712 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" event={"ID":"be7063fc-de10-4911-8c87-c3251c274912","Type":"ContainerStarted","Data":"d9ee988b7a344386d710df099c4d1f8cf7d16be694f6afbed7666b69c8a95e8e"}
Mar 14 07:01:01 crc kubenswrapper[4893]: W0314 07:01:01.250702 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23fcc26_f800_497a_b038_065687659df7.slice/crio-31bec58d93811f551727a08548f02987848bab6e981ec916183a004dcb7d02ce WatchSource:0}: Error finding container 31bec58d93811f551727a08548f02987848bab6e981ec916183a004dcb7d02ce: Status 404 returned error can't find the container with id 31bec58d93811f551727a08548f02987848bab6e981ec916183a004dcb7d02ce
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.252220 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerStarted","Data":"1f861f6f715c76bf703ff7d48a856e92a0410c0cbe8ce41eae5e22329ccf5305"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.277814 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" event={"ID":"3009e07b-2452-425c-95c3-3a78fa993d62","Type":"ContainerStarted","Data":"f346a9d6640b2d3342f142ca50107602a1405bb3b1a5118c157b77375c6a113a"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.284687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jh6jw" event={"ID":"72b53c84-bbb9-4fd1-9acd-625ecf005bff","Type":"ContainerStarted","Data":"d7070f03be3d3f399920e5067f66008cc543c573c6c6beac928491e1d3a4dd37"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.293646 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" event={"ID":"122146eb-6174-4393-aabf-121f89674223","Type":"ContainerStarted","Data":"1b10ad77e27f4e3b4f166ab7cce109a928fa5793445ccf5e8fed094931bcbc67"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.296072 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vbg6j"]
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.299861 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" event={"ID":"ef07db86-4677-41ce-8b6d-7960cc63a9b8","Type":"ContainerStarted","Data":"02fc4b678f0893c8a6eea4cb8279b097ab0f8633c7fe5569ac93665c40a88ee3"}
Mar 14 07:01:01 crc kubenswrapper[4893]: W0314 07:01:01.303780 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7aced4_1923_447e_a85d_b84ff4974986.slice/crio-9ed6028b2bec2c98d96222522231093f57672827fbb2bbbf35c4fcb3c89b9163 WatchSource:0}: Error finding container 9ed6028b2bec2c98d96222522231093f57672827fbb2bbbf35c4fcb3c89b9163: Status 404 returned error can't find the container with id 9ed6028b2bec2c98d96222522231093f57672827fbb2bbbf35c4fcb3c89b9163
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.305168 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" event={"ID":"b511756d-571f-4e3f-9fde-bd9cd2d6e038","Type":"ContainerStarted","Data":"0338e2d6c993a499521972e89182086ca4d6cf04a04db4e5d7511fa7051891f8"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.313411 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" event={"ID":"ef0e0ce2-d109-426d-8d69-7eb458708189","Type":"ContainerStarted","Data":"753ea4868e9b9c19998ef96097034b720db4a5d469db82a8520c2a7d9020923a"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.327125 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" event={"ID":"39b2c526-e373-404b-a607-bdd8d29c8fae","Type":"ContainerStarted","Data":"b339f8ec1ab93f2da7908df23421309b7860358a2724e6af3e37478e28157797"}
Mar 14 07:01:01 crc kubenswrapper[4893]: W0314 07:01:01.328096 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3cbd4f_22c0_45f3_8f49_ac687b5d7ab9.slice/crio-864651349f8df1f5d4599ace23fddee6f61035514ad0b0ba08ca8d7eb8b4b65c WatchSource:0}: Error finding container 864651349f8df1f5d4599ace23fddee6f61035514ad0b0ba08ca8d7eb8b4b65c: Status 404 returned error can't find the container with id 864651349f8df1f5d4599ace23fddee6f61035514ad0b0ba08ca8d7eb8b4b65c
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.332784 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" event={"ID":"bc155a4b-53ea-4394-87d2-d4f966e3589d","Type":"ContainerStarted","Data":"8f8ed8f0e7e4ae13207575efe082361b6e1101358ef557250d9a303c95deecf9"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.334509 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.334711 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.334793 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.334816 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.334840 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.336264 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.836248123 +0000 UTC m=+141.098424915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.345620 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.358070 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" event={"ID":"928468fc-c237-4779-a2c6-7365b3764fe8","Type":"ContainerStarted","Data":"fa62d26f6bce6ceb8f7e651a0b2d60460ae87c6adc1dd272bd9ccc46e5424bc8"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.358909 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.361724 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.365573 4893 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sfrgk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body=
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.365612 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.376892 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.408067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.422644 4893 patch_prober.go:28] interesting pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.422696 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.423105 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" event={"ID":"33e432c4-ff58-4a8d-98b0-28668b3d7a88","Type":"ContainerStarted","Data":"ab409ebe0f637189bfbaa982dd421c87b392c3cfdd0957bca710dfa58248b38a"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.423145 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2p482" event={"ID":"e026a9f7-3301-4162-9710-706837e6d0ea","Type":"ContainerStarted","Data":"8a75a017070017e64e859f5fb7118155123ca2bb00a72e3c3ec58b502a765ad5"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.423166 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vkgvv"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.423177 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vkgvv" event={"ID":"b1b92f23-a052-41c6-817f-d43b04079105","Type":"ContainerStarted","Data":"a570edde6cdcce5cc2a154f3cd8a77a36509eeab69ad000349cb2ffa5d144fff"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.436037 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.437291 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:01.937274168 +0000 UTC m=+141.199450960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.445151 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" event={"ID":"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f","Type":"ContainerStarted","Data":"6b1ab14a9db6ba46c241edb96185f1a2d78d32df58a59de05b447bbdc3d71aad"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.460085 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" event={"ID":"b82b8dee-58d7-4701-ad47-8ddc20898935","Type":"ContainerStarted","Data":"531bd6e9fd378ba597d38af914fd2d1a940df5beddb7af5a13b63d1d77e42089"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.469204 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" event={"ID":"c2a95849-5cd5-46db-9979-7ed894406e61","Type":"ContainerStarted","Data":"d30165286db7f949f0eee2782d9cfe99d839e2962a59d68f2812008d28f27b9f"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.470813 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" podStartSLOduration=78.470798433 podStartE2EDuration="1m18.470798433s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.468803435 +0000 UTC m=+140.730980247" watchObservedRunningTime="2026-03-14 07:01:01.470798433 +0000 UTC m=+140.732975225"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.471646 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" podStartSLOduration=78.471641174 podStartE2EDuration="1m18.471641174s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.430860913 +0000 UTC m=+140.693037715" watchObservedRunningTime="2026-03-14 07:01:01.471641174 +0000 UTC m=+140.733817966"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.479689 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" event={"ID":"cf444907-0c34-43ab-9bbd-b9ef0773743c","Type":"ContainerStarted","Data":"5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.480351 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.506155 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.509819 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" event={"ID":"26699cc4-344e-4154-baa7-8964daba84f7","Type":"ContainerStarted","Data":"82c287f7f2aeb173c92a106b503aafae0e599e66f6ebdd43cd255bfc073a3a59"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.511746 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.517028 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jn9rp" podStartSLOduration=78.517005707 podStartE2EDuration="1m18.517005707s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.50193207 +0000 UTC m=+140.764108892" watchObservedRunningTime="2026-03-14 07:01:01.517005707 +0000 UTC m=+140.779182499"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.517489 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 07:01:01 crc kubenswrapper[4893]: W0314 07:01:01.527358 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871c529e_b263_4600_982a_d7be266f86e4.slice/crio-efd77547ee060b1dd0ff40ef4c2aa00f4a94e62a3f9c445d81963e732554a01f WatchSource:0}: Error finding container efd77547ee060b1dd0ff40ef4c2aa00f4a94e62a3f9c445d81963e732554a01f: Status 404 returned error can't find the container with id efd77547ee060b1dd0ff40ef4c2aa00f4a94e62a3f9c445d81963e732554a01f
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.533651 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" event={"ID":"42dc5935-aed6-4fee-a749-3c292d042df5","Type":"ContainerStarted","Data":"995a86a0ea4dbdac43da5fbf0e198df7c1ec2bda1b9060473dde840bba351d5a"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.537965 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.539232 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.039209847 +0000 UTC m=+141.301386639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.543736 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" event={"ID":"38206b8e-cb94-47b8-b857-63a073f99ef7","Type":"ContainerStarted","Data":"4feffbe387bfe0f6db20b9ad99a538ed732b5cc9d1bd9966288d14fe7230b546"}
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.549711 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.569264 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ss7r8" podStartSLOduration=78.569243946 podStartE2EDuration="1m18.569243946s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.540341034 +0000 UTC m=+140.802517836" watchObservedRunningTime="2026-03-14 07:01:01.569243946 +0000 UTC m=+140.831420738"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.616080 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4jvl" podStartSLOduration=78.616064185 podStartE2EDuration="1m18.616064185s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.614492277 +0000 UTC m=+140.876669069" watchObservedRunningTime="2026-03-14 07:01:01.616064185 +0000 UTC m=+140.878240977"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.617627 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" podStartSLOduration=61.617620333 podStartE2EDuration="1m1.617620333s" podCreationTimestamp="2026-03-14 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.571309627 +0000 UTC m=+140.833486429" watchObservedRunningTime="2026-03-14 07:01:01.617620333 +0000 UTC m=+140.879797125"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.643654 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr"
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.646163 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.146152486 +0000 UTC m=+141.408329278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.693146 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xmj8x" podStartSLOduration=77.693128768 podStartE2EDuration="1m17.693128768s" podCreationTimestamp="2026-03-14 06:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.653050734 +0000 UTC m=+140.915227546" watchObservedRunningTime="2026-03-14 07:01:01.693128768 +0000 UTC m=+140.955305560"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.695201 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vkgvv" podStartSLOduration=78.695194688 podStartE2EDuration="1m18.695194688s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.690026043 +0000 UTC m=+140.952202835" watchObservedRunningTime="2026-03-14 07:01:01.695194688 +0000 UTC m=+140.957371480"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.745473 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.745758 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.245741188 +0000 UTC m=+141.507917980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.789781 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" podStartSLOduration=78.789759997 podStartE2EDuration="1m18.789759997s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.784015278 +0000 UTC m=+141.046192090" watchObservedRunningTime="2026-03-14 07:01:01.789759997 +0000 UTC m=+141.051936789"
Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.791083 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qgm7k" podStartSLOduration=78.791078169 podStartE2EDuration="1m18.791078169s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.731512511 +0000 UTC
m=+140.993689313" watchObservedRunningTime="2026-03-14 07:01:01.791078169 +0000 UTC m=+141.053254961" Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.813629 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podStartSLOduration=6.813612947 podStartE2EDuration="6.813612947s" podCreationTimestamp="2026-03-14 07:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.812823507 +0000 UTC m=+141.075000299" watchObservedRunningTime="2026-03-14 07:01:01.813612947 +0000 UTC m=+141.075789739" Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.847991 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.848158 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2p482" podStartSLOduration=6.848145127 podStartE2EDuration="6.848145127s" podCreationTimestamp="2026-03-14 07:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.847613633 +0000 UTC m=+141.109790455" watchObservedRunningTime="2026-03-14 07:01:01.848145127 +0000 UTC m=+141.110321919" Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.848310 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:01:02.34829878 +0000 UTC m=+141.610475572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.883028 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvvds" podStartSLOduration=78.883010194 podStartE2EDuration="1m18.883010194s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.876284001 +0000 UTC m=+141.138460793" watchObservedRunningTime="2026-03-14 07:01:01.883010194 +0000 UTC m=+141.145186986" Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.952102 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:01 crc kubenswrapper[4893]: E0314 07:01:01.952379 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.45236507 +0000 UTC m=+141.714541862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.981182 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:01 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:01 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:01 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:01 crc kubenswrapper[4893]: I0314 07:01:01.981224 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.053702 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.054075 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:01:02.554063192 +0000 UTC m=+141.816239974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.156485 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.157166 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.657150088 +0000 UTC m=+141.919326870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.264577 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.265094 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.765081581 +0000 UTC m=+142.027258373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.372330 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.372737 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.872717188 +0000 UTC m=+142.134893980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.372764 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.372808 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.476244 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.476702 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:02.976689796 +0000 UTC m=+142.238866588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.580066 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.580545 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.08051235 +0000 UTC m=+142.342689142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.580748 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.581056 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.081047902 +0000 UTC m=+142.343224694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.588761 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6xtr4"] Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.649331 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.673890 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" event={"ID":"be7063fc-de10-4911-8c87-c3251c274912","Type":"ContainerStarted","Data":"d2ef4b1549a5b596d08086776d85373c5e869ecca58fd853643f2a9ad3f5fc55"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.674543 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.678606 4893 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tsbfw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.678647 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" podUID="be7063fc-de10-4911-8c87-c3251c274912" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.684091 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.684365 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.184350834 +0000 UTC m=+142.446527626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.684483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" event={"ID":"33e432c4-ff58-4a8d-98b0-28668b3d7a88","Type":"ContainerStarted","Data":"191482e385d39ab0d3f9b5fda9af96b094827e5cfa3b3f5f08388be847816f67"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.685200 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.690614 4893 
patch_prober.go:28] interesting pod/catalog-operator-68c6474976-57kx4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.690672 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" podUID="33e432c4-ff58-4a8d-98b0-28668b3d7a88" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.733946 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" event={"ID":"122146eb-6174-4393-aabf-121f89674223","Type":"ContainerStarted","Data":"7098c1781f808b69c67cf0b0f0c926153e3920c3bbc783579a382f7cb714b120"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.750180 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" event={"ID":"95fe2bd1-1c99-4105-931e-6a60dd881260","Type":"ContainerStarted","Data":"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.750725 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.759784 4893 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-m8wwx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" 
start-of-body= Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.759838 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.766030 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" podStartSLOduration=79.766010769 podStartE2EDuration="1m19.766010769s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:02.760265529 +0000 UTC m=+142.022442351" watchObservedRunningTime="2026-03-14 07:01:02.766010769 +0000 UTC m=+142.028187561" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.772099 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" event={"ID":"bc155a4b-53ea-4394-87d2-d4f966e3589d","Type":"ContainerStarted","Data":"b100914c965c1f465239ef114613b1ff2df958321a89e38d14ca2f1dc2199d01"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.772584 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.777719 4893 patch_prober.go:28] interesting pod/console-operator-58897d9998-cgrf8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.777778 4893 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" podUID="bc155a4b-53ea-4394-87d2-d4f966e3589d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.788292 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 07:01:02.789859 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.289840048 +0000 UTC m=+142.552016880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.811183 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" event={"ID":"26699cc4-344e-4154-baa7-8964daba84f7","Type":"ContainerStarted","Data":"a218dde2c117e8799c454f48c438df8e19ac1dfd9cff1228272b07e2870b5a0c"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.811228 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" event={"ID":"26699cc4-344e-4154-baa7-8964daba84f7","Type":"ContainerStarted","Data":"6174d1dad46a5bf53e874c6bb225b33f7656ff325708cee4c78c0d7d25eec8ff"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.811874 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.840360 4893 generic.go:334] "Generic (PLEG): container finished" podID="ef07db86-4677-41ce-8b6d-7960cc63a9b8" containerID="5cbdc8e7db8268b59eb24892f73c801b8c89377275b4e6cb4a0c83842189abeb" exitCode=0 Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.840436 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" event={"ID":"ef07db86-4677-41ce-8b6d-7960cc63a9b8","Type":"ContainerDied","Data":"5cbdc8e7db8268b59eb24892f73c801b8c89377275b4e6cb4a0c83842189abeb"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 
07:01:02.842032 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" event={"ID":"38206b8e-cb94-47b8-b857-63a073f99ef7","Type":"ContainerStarted","Data":"ab6b2dbcb55a87d65c1160efab5463e039749f37f6ef239b4ee2d8e94b31381d"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.851959 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" podStartSLOduration=79.851943798 podStartE2EDuration="1m19.851943798s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:02.851377494 +0000 UTC m=+142.113554286" watchObservedRunningTime="2026-03-14 07:01:02.851943798 +0000 UTC m=+142.114120590" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.854670 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5095bb5125f272011f74e0ce281181391d1abd79b09dbeeeab459090ad32f02c"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.875760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" event={"ID":"5b706875-4635-4744-941b-f235ebd548b0","Type":"ContainerStarted","Data":"de15eeaceb74f04edd6ee788371f9c1b2b174707980b8c279fbc820d4b0d5c97"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.889682 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:02 crc kubenswrapper[4893]: E0314 
07:01:02.890510 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.390496615 +0000 UTC m=+142.652673407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.898162 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" event={"ID":"4a772f0d-1184-43e0-9d33-b2d024933bd9","Type":"ContainerStarted","Data":"71704bb591275a0340b63f5393ad4a68687c7bf1a49ddc330ca725bb00b46954"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.899806 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" podStartSLOduration=79.899795561 podStartE2EDuration="1m19.899795561s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:02.896797919 +0000 UTC m=+142.158974731" watchObservedRunningTime="2026-03-14 07:01:02.899795561 +0000 UTC m=+142.161972353" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.912539 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" 
event={"ID":"19d07c7b-828b-4582-aa4c-a6f4c1f09c9f","Type":"ContainerStarted","Data":"f85381f3150ed06a377e5c17a7d2f545a8652dba945531c181b3127e4490535c"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.914036 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" event={"ID":"7f7aced4-1923-447e-a85d-b84ff4974986","Type":"ContainerStarted","Data":"9ed6028b2bec2c98d96222522231093f57672827fbb2bbbf35c4fcb3c89b9163"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.920904 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" event={"ID":"0a74016e-259d-4d37-b12b-1a1255430d59","Type":"ContainerStarted","Data":"5f44077d78a23ba31a2a16e8278c4727a26fbd9192a2f9ff3abd22c28f759c55"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.920933 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" event={"ID":"0a74016e-259d-4d37-b12b-1a1255430d59","Type":"ContainerStarted","Data":"75a3263013e7e5716c6eae66dd54a979c892fa3693065aa93364b8f37aef9a51"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.932861 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-psm2j" event={"ID":"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9","Type":"ContainerStarted","Data":"864651349f8df1f5d4599ace23fddee6f61035514ad0b0ba08ca8d7eb8b4b65c"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.939558 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" podStartSLOduration=79.939542507 podStartE2EDuration="1m19.939542507s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:02.937336894 +0000 UTC m=+142.199513686" 
watchObservedRunningTime="2026-03-14 07:01:02.939542507 +0000 UTC m=+142.201719319" Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.972352 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" event={"ID":"2336c0f6-9ad9-45d3-bb00-cc5832466e7c","Type":"ContainerStarted","Data":"ccff6712c329a69db88aae6811570dbaa7599dd503f41668158c7ce6322a6ddc"} Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.978630 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:02 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:02 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:02 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:02 crc kubenswrapper[4893]: I0314 07:01:02.978844 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.008836 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" podStartSLOduration=79.008819282 podStartE2EDuration="1m19.008819282s" podCreationTimestamp="2026-03-14 06:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:02.980903033 +0000 UTC m=+142.243079835" watchObservedRunningTime="2026-03-14 07:01:03.008819282 +0000 UTC m=+142.270996074" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.010330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerStarted","Data":"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.014589 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.015216 4893 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jn8vg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.015286 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.016141 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.020189 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.520170198 +0000 UTC m=+142.782346990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.052201 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4wwz" podStartSLOduration=80.052185726 podStartE2EDuration="1m20.052185726s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.009981119 +0000 UTC m=+142.272157921" watchObservedRunningTime="2026-03-14 07:01:03.052185726 +0000 UTC m=+142.314362518" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.064602 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" event={"ID":"871c529e-b263-4600-982a-d7be266f86e4","Type":"ContainerStarted","Data":"efd77547ee060b1dd0ff40ef4c2aa00f4a94e62a3f9c445d81963e732554a01f"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.065794 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7wtvk" podStartSLOduration=80.065777846 podStartE2EDuration="1m20.065777846s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.051914618 +0000 UTC m=+142.314091410" watchObservedRunningTime="2026-03-14 07:01:03.065777846 +0000 
UTC m=+142.327954638" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.096863 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wnwsg" podStartSLOduration=79.096847802 podStartE2EDuration="1m19.096847802s" podCreationTimestamp="2026-03-14 06:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.09431353 +0000 UTC m=+142.356490322" watchObservedRunningTime="2026-03-14 07:01:03.096847802 +0000 UTC m=+142.359024594" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.118537 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jh6jw" event={"ID":"72b53c84-bbb9-4fd1-9acd-625ecf005bff","Type":"ContainerStarted","Data":"3be93a13f143359f3c4c877cbdee044580c452fbd8dc56e9028f5f0ccd14addc"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.119590 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.119690 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.619666876 +0000 UTC m=+142.881843668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.120009 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.120341 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.620324232 +0000 UTC m=+142.882501034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.142572 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-psm2j" podStartSLOduration=80.142553762 podStartE2EDuration="1m20.142553762s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.141772924 +0000 UTC m=+142.403949746" watchObservedRunningTime="2026-03-14 07:01:03.142553762 +0000 UTC m=+142.404730554" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.192582 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" podStartSLOduration=80.192562738 podStartE2EDuration="1m20.192562738s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.192336863 +0000 UTC m=+142.454513665" watchObservedRunningTime="2026-03-14 07:01:03.192562738 +0000 UTC m=+142.454739530" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.212165 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" event={"ID":"41364595-3dec-491e-823b-5cd2d7c4ea46","Type":"ContainerStarted","Data":"c838552d7b6be97772c86d7189bc5ce1c047b8e2813bb8069257e4af91175cd3"} Mar 14 07:01:03 crc 
kubenswrapper[4893]: I0314 07:01:03.228788 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.229718 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.72970352 +0000 UTC m=+142.991880312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.229875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"72c244f9d13e139eb183ba00afd89bf1ed30bca7d53d2207b03afe8434f5d431"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.231278 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jh6jw" podStartSLOduration=8.231255309 podStartE2EDuration="8.231255309s" podCreationTimestamp="2026-03-14 07:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
07:01:03.228293286 +0000 UTC m=+142.490470078" watchObservedRunningTime="2026-03-14 07:01:03.231255309 +0000 UTC m=+142.493432101" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.232117 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.262903 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" event={"ID":"c2a95849-5cd5-46db-9979-7ed894406e61","Type":"ContainerStarted","Data":"6a497a03dc3c895f4a1793d34919c4961362d644b04467e071231b591eac4ecf"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.288357 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" event={"ID":"f23fcc26-f800-497a-b038-065687659df7","Type":"ContainerStarted","Data":"b8ccb0896067b9099d6bcc40f8df720124dc85219334e4d371907bb50ec0b291"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.288395 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" event={"ID":"f23fcc26-f800-497a-b038-065687659df7","Type":"ContainerStarted","Data":"31bec58d93811f551727a08548f02987848bab6e981ec916183a004dcb7d02ce"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.310972 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" event={"ID":"ef0e0ce2-d109-426d-8d69-7eb458708189","Type":"ContainerStarted","Data":"86d4d06d7766ef5dfa291709376e4ddeac4522b7b50d244000e7368d4a1e9beb"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.319573 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wt8zp" podStartSLOduration=80.319558865 
podStartE2EDuration="1m20.319558865s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.318424168 +0000 UTC m=+142.580600970" watchObservedRunningTime="2026-03-14 07:01:03.319558865 +0000 UTC m=+142.581735657" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.322608 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xxwxx" event={"ID":"f5e5a700-4ef2-4e79-9eaf-b1291e395185","Type":"ContainerStarted","Data":"589ab1c0071eeebe502ce21e51c1024868508d1f1b41c97e9af64f63224d6aaa"} Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.330453 4893 patch_prober.go:28] interesting pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.330700 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.331595 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.337043 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.83702615 +0000 UTC m=+143.099202942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.341240 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmn9t" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.346169 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.389181 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v7lxq" podStartSLOduration=80.389166117 podStartE2EDuration="1m20.389166117s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.388343128 +0000 UTC m=+142.650519940" watchObservedRunningTime="2026-03-14 07:01:03.389166117 +0000 UTC m=+142.651342909" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.421429 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.422155 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.424354 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.424549 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.425546 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.434078 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.439645 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:03.939610733 +0000 UTC m=+143.201787525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.455041 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qq2pq" podStartSLOduration=80.455019518 podStartE2EDuration="1m20.455019518s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:03.443340404 +0000 UTC m=+142.705517206" watchObservedRunningTime="2026-03-14 07:01:03.455019518 +0000 UTC m=+142.717196310" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.537173 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.537223 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.537256 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.537588 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.037568324 +0000 UTC m=+143.299745116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.637984 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.638263 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: 
I0314 07:01:03.638327 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.638805 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.138790386 +0000 UTC m=+143.400967178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.638853 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.645038 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40334: no serving certificate available for the kubelet" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.674069 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wdnt5" Mar 14 07:01:03 
crc kubenswrapper[4893]: I0314 07:01:03.681480 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.740645 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.742021 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.241990634 +0000 UTC m=+143.504167426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.811584 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.838638 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40340: no serving certificate available for the kubelet" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.842240 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.842445 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.342417145 +0000 UTC m=+143.604593937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.842576 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.842868 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.342860746 +0000 UTC m=+143.605037538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.943931 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.944122 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.444096657 +0000 UTC m=+143.706273449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.944169 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:03 crc kubenswrapper[4893]: E0314 07:01:03.944827 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.444810264 +0000 UTC m=+143.706987056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.976508 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40354: no serving certificate available for the kubelet" Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.984848 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:03 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:03 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:03 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:03 crc kubenswrapper[4893]: I0314 07:01:03.984913 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.045089 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.045299 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.545272757 +0000 UTC m=+143.807449549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.098656 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40360: no serving certificate available for the kubelet" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.146848 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.147168 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.647156244 +0000 UTC m=+143.909333036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.177638 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40374: no serving certificate available for the kubelet" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.240185 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.264312 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.264729 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.764711571 +0000 UTC m=+144.026888363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.279731 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40390: no serving certificate available for the kubelet" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.368930 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.369307 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.869286723 +0000 UTC m=+144.131463515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.376307 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" event={"ID":"0a74016e-259d-4d37-b12b-1a1255430d59","Type":"ContainerStarted","Data":"e539edacff54a51dd31a4043bd85af58dc231465177eb57c5c15dc5611e4f93a"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.378782 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40398: no serving certificate available for the kubelet" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.387562 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" event={"ID":"122146eb-6174-4393-aabf-121f89674223","Type":"ContainerStarted","Data":"7a7605ffc8f628341c30c057f0e917a83f82efde8970da9dfd7b0e0aee6a1df1"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.397475 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-psm2j" event={"ID":"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9","Type":"ContainerStarted","Data":"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.417661 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" event={"ID":"5b706875-4635-4744-941b-f235ebd548b0","Type":"ContainerStarted","Data":"bf21d507f3141dc5cece72fb4b3c163d7466a850a738197b7824cd5f59e95ea2"} Mar 14 07:01:04 crc 
kubenswrapper[4893]: I0314 07:01:04.423281 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" event={"ID":"38206b8e-cb94-47b8-b857-63a073f99ef7","Type":"ContainerStarted","Data":"7197de2998b5155b7ff0815db50697f920722a592d788778b31b5d5738466341"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.427929 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ccjq4" podStartSLOduration=81.427907008 podStartE2EDuration="1m21.427907008s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.399926338 +0000 UTC m=+143.662103130" watchObservedRunningTime="2026-03-14 07:01:04.427907008 +0000 UTC m=+143.690083800" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.433405 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-spcr2" podStartSLOduration=81.433393581 podStartE2EDuration="1m21.433393581s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.426851292 +0000 UTC m=+143.689028104" watchObservedRunningTime="2026-03-14 07:01:04.433393581 +0000 UTC m=+143.695570373" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.446008 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" event={"ID":"871c529e-b263-4600-982a-d7be266f86e4","Type":"ContainerStarted","Data":"904ec7eee40bd534c5023fe070e008fff8582f244beb9faea11e47412e50c22c"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.446050 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" event={"ID":"871c529e-b263-4600-982a-d7be266f86e4","Type":"ContainerStarted","Data":"f2ff5ccf8047c1f2da90a1766eb7e3efa1cd63c0a016d98bf4bd9f7159fa02f7"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.454249 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xxwxx" event={"ID":"f5e5a700-4ef2-4e79-9eaf-b1291e395185","Type":"ContainerStarted","Data":"744207c7094659ff220eddf7fce6a9431a2b94ca5d128b53c65057ec808bcd5f"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.454284 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xxwxx" event={"ID":"f5e5a700-4ef2-4e79-9eaf-b1291e395185","Type":"ContainerStarted","Data":"e511733f0b89643659597d676b090fb42241bfe8a5c004aad23e582921935635"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.454752 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-xxwxx" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.457491 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3c36b8d-4898-4780-8e79-f4986e676f9d","Type":"ContainerStarted","Data":"d59c7d7390bfbb8861a61eb3b91f23e2a453aa379a2fd061897f486b3a6ad5db"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.458655 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" event={"ID":"41364595-3dec-491e-823b-5cd2d7c4ea46","Type":"ContainerStarted","Data":"c2f368c003ce22068cdfb308ff4a698e61e762a0dd131c6f353116601a8d443e"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.460419 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1950265ad8050b1cdfdd09e1b0641b49282431a16dcd028eb938c612b2ab5925"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.462594 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"657fa20f6d51fbcf9ead8e737cdef0ad441419b7bd78b1d8b3481b68dc368517"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.462638 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a5e23fcf240cae1926c2acb9e82fa7d0999b7ecd1eadda4e19478fb835ec8250"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.472923 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" event={"ID":"7f7aced4-1923-447e-a85d-b84ff4974986","Type":"ContainerStarted","Data":"cdfbd28e3ab0d341370024d8c74b1f5fce52774b186bba6828fc4dd78e793bf8"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.473515 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gnrsd" podStartSLOduration=81.473498227 podStartE2EDuration="1m21.473498227s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.448011537 +0000 UTC m=+143.710188339" watchObservedRunningTime="2026-03-14 07:01:04.473498227 +0000 UTC m=+143.735675019" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.474117 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.474299 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.974273065 +0000 UTC m=+144.236449857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.474446 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.476252 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:04.976240743 +0000 UTC m=+144.238417535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.510851 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vbg6j" podStartSLOduration=81.510837474 podStartE2EDuration="1m21.510837474s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.473713242 +0000 UTC m=+143.735890034" watchObservedRunningTime="2026-03-14 07:01:04.510837474 +0000 UTC m=+143.773014266" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.512045 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3b95d27b09dfa14fc540e1a733220a4162dab4da57c8e8eeb8a6c87b7a98a5ed"} Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.513751 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" gracePeriod=30 Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.514183 4893 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jn8vg container/marketplace-operator namespace/openshift-marketplace: Readiness 
probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.514215 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.522471 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.535926 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tsbfw" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.536637 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57kx4" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.548134 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40412: no serving certificate available for the kubelet" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.560165 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xxwxx" podStartSLOduration=9.560146833 podStartE2EDuration="9.560146833s" podCreationTimestamp="2026-03-14 07:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.511743196 +0000 UTC m=+143.773920008" watchObservedRunningTime="2026-03-14 07:01:04.560146833 +0000 UTC m=+143.822323625" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.583406 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.591023 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2vgnl" podStartSLOduration=81.591004793 podStartE2EDuration="1m21.591004793s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.582871035 +0000 UTC m=+143.845047827" watchObservedRunningTime="2026-03-14 07:01:04.591004793 +0000 UTC m=+143.853181575" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.592456 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.092437068 +0000 UTC m=+144.354613870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.620326 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l845" podStartSLOduration=81.620312076 podStartE2EDuration="1m21.620312076s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:04.617992499 +0000 UTC m=+143.880169311" watchObservedRunningTime="2026-03-14 07:01:04.620312076 +0000 UTC m=+143.882488868" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.689400 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.697290 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.197275257 +0000 UTC m=+144.459452039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.788291 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cgrf8" Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.792229 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.792487 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.292471881 +0000 UTC m=+144.554648673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.912805 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:04 crc kubenswrapper[4893]: E0314 07:01:04.913345 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.413333788 +0000 UTC m=+144.675510580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.982877 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:04 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:04 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:04 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:04 crc kubenswrapper[4893]: I0314 07:01:04.982937 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.013934 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.014236 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:01:05.514217541 +0000 UTC m=+144.776394333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.018389 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.115330 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.115634 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.615623235 +0000 UTC m=+144.877800027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.217337 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.217603 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.717589014 +0000 UTC m=+144.979765806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.250755 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40422: no serving certificate available for the kubelet" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.319492 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.319883 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.819866031 +0000 UTC m=+145.082042823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.352667 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.353707 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.355446 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.372175 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.420545 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.420948 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.920927427 +0000 UTC m=+145.183104219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.519170 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" event={"ID":"ef07db86-4677-41ce-8b6d-7960cc63a9b8","Type":"ContainerStarted","Data":"e339bf08d250bfd0617bcfc858ce9e28a44c59e2c8111223748d24046fe66c1a"} Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.521565 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.521613 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.521648 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " 
pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.521692 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jrk\" (UniqueName: \"kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.521999 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:06.021985514 +0000 UTC m=+145.284162306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.522260 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" event={"ID":"38206b8e-cb94-47b8-b857-63a073f99ef7","Type":"ContainerStarted","Data":"310e41f6f7355a5c884ad43bdfe19226dd49e74875dc66725a4b6d01db12fe51"} Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.525409 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3c36b8d-4898-4780-8e79-f4986e676f9d","Type":"ContainerStarted","Data":"5ce9dc6c81d97f31b9b4dfe2644ee91acb05a00becfa0bdf0f84c1051da80fc0"} Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 
07:01:05.533628 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.541329 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.541309493 podStartE2EDuration="2.541309493s" podCreationTimestamp="2026-03-14 07:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:05.538255869 +0000 UTC m=+144.800432661" watchObservedRunningTime="2026-03-14 07:01:05.541309493 +0000 UTC m=+144.803486285" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.548718 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.549707 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.552160 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.571387 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.618794 4893 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.622374 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.622772 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5crnx\" (UniqueName: \"kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.623155 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.623302 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.623382 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.623496 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.623792 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jrk\" (UniqueName: \"kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.624835 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:06.124813614 +0000 UTC m=+145.386990406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.626989 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.628418 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.648268 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jrk\" (UniqueName: \"kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk\") pod \"community-operators-cmmld\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.684277 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.726318 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.726377 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.726409 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.726470 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5crnx\" (UniqueName: \"kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.726743 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:01:06.226713981 +0000 UTC m=+145.488890773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.727252 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.727446 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.757041 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.758113 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.766975 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.774552 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5crnx\" (UniqueName: \"kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx\") pod \"certified-operators-727v2\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.839369 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.839935 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:06.339916413 +0000 UTC m=+145.602093205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.873630 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.943653 4893 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-14T07:01:05.618818618Z","Handler":null,"Name":""} Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.944345 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks8zw\" (UniqueName: \"kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.944396 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.944426 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.944501 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:05 crc kubenswrapper[4893]: E0314 07:01:05.944825 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:01:06.444814163 +0000 UTC m=+145.706990955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjzxr" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.959669 4893 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.959700 4893 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.987702 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:05 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:05 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:05 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:05 crc kubenswrapper[4893]: I0314 07:01:05.988052 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.007613 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 
07:01:06.008811 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.009824 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.045003 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.045351 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.045423 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks8zw\" (UniqueName: \"kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.045460 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.046159 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.046390 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.053391 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.059866 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.075838 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks8zw\" (UniqueName: \"kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw\") pod \"community-operators-vhk5b\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.090915 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.147847 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzwh\" (UniqueName: \"kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.147917 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.147952 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.147974 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.170744 4893 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.170792 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.192375 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.192650 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" containerID="cri-o://ee417ed0d4a034b0fa24a26bc52e41cd081acb674244d479d6e3cb1b213ae4ad" gracePeriod=30 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.200834 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.206514 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.256184 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzwh\" (UniqueName: \"kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " 
pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.256468 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.256514 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.256929 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.257236 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.279662 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.285601 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzwh\" (UniqueName: 
\"kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh\") pod \"certified-operators-dkbml\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: W0314 07:01:06.320578 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb806cc_dc34_40ff_b7d5_c33a575822ec.slice/crio-a6d444db79cf5a51427aefa0db554dbc6509fb62dcc44c8d03cd38ce740eecbc WatchSource:0}: Error finding container a6d444db79cf5a51427aefa0db554dbc6509fb62dcc44c8d03cd38ce740eecbc: Status 404 returned error can't find the container with id a6d444db79cf5a51427aefa0db554dbc6509fb62dcc44c8d03cd38ce740eecbc Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.327558 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjzxr\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.350890 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.535154 4893 generic.go:334] "Generic (PLEG): container finished" podID="3009e07b-2452-425c-95c3-3a78fa993d62" containerID="f346a9d6640b2d3342f142ca50107602a1405bb3b1a5118c157b77375c6a113a" exitCode=0 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.535483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" event={"ID":"3009e07b-2452-425c-95c3-3a78fa993d62","Type":"ContainerDied","Data":"f346a9d6640b2d3342f142ca50107602a1405bb3b1a5118c157b77375c6a113a"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.539465 4893 generic.go:334] "Generic (PLEG): container finished" podID="f3c36b8d-4898-4780-8e79-f4986e676f9d" containerID="5ce9dc6c81d97f31b9b4dfe2644ee91acb05a00becfa0bdf0f84c1051da80fc0" exitCode=0 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.539562 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3c36b8d-4898-4780-8e79-f4986e676f9d","Type":"ContainerDied","Data":"5ce9dc6c81d97f31b9b4dfe2644ee91acb05a00becfa0bdf0f84c1051da80fc0"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.542160 4893 generic.go:334] "Generic (PLEG): container finished" podID="b1c55410-c44f-483c-801a-de26ae05a415" containerID="20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b" exitCode=0 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.542242 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerDied","Data":"20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.542288 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerStarted","Data":"42f4cdccc077318852a23e6115fa0a24f777f47214a0a9abf2a3f8029dd0491c"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.547150 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.561923 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerStarted","Data":"a6d444db79cf5a51427aefa0db554dbc6509fb62dcc44c8d03cd38ce740eecbc"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.567023 4893 generic.go:334] "Generic (PLEG): container finished" podID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerID="ee417ed0d4a034b0fa24a26bc52e41cd081acb674244d479d6e3cb1b213ae4ad" exitCode=0 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.567095 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" event={"ID":"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3","Type":"ContainerDied","Data":"ee417ed0d4a034b0fa24a26bc52e41cd081acb674244d479d6e3cb1b213ae4ad"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.571474 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" event={"ID":"ef07db86-4677-41ce-8b6d-7960cc63a9b8","Type":"ContainerStarted","Data":"edcec5e4fdabccd2ee50cd629759d3ba71d0ae360714a2a44ee515ad4f7ae681"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.573030 4893 ???:1] "http: TLS handshake error from 192.168.126.11:40424: no serving certificate available for the kubelet" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.584288 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.586628 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" event={"ID":"38206b8e-cb94-47b8-b857-63a073f99ef7","Type":"ContainerStarted","Data":"e0dfa728ffbdf05044ce13c02beb4a1202afc499e2308e8e59e8b82c1e081ced"} Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.586652 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerName="route-controller-manager" containerID="cri-o://a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6" gracePeriod=30 Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.603067 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.619646 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-phr5l" podStartSLOduration=11.619628046999999 podStartE2EDuration="11.619628047s" podCreationTimestamp="2026-03-14 07:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:06.619140365 +0000 UTC m=+145.881317167" watchObservedRunningTime="2026-03-14 07:01:06.619628047 +0000 UTC m=+145.881804839" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.663237 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" podStartSLOduration=83.663219127 podStartE2EDuration="1m23.663219127s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:06.65843514 +0000 UTC m=+145.920611952" watchObservedRunningTime="2026-03-14 07:01:06.663219127 +0000 UTC m=+145.925395919" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.697184 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.713995 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.764482 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbpjl\" (UniqueName: \"kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl\") pod \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.764923 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert\") pod \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.765030 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles\") pod \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.765079 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca\") pod \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " Mar 14 07:01:06 crc 
kubenswrapper[4893]: I0314 07:01:06.765117 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config\") pod \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\" (UID: \"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3\") " Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.769153 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" (UID: "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.769531 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" (UID: "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.770459 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config" (OuterVolumeSpecName: "config") pod "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" (UID: "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.776939 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" (UID: "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.778823 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl" (OuterVolumeSpecName: "kube-api-access-wbpjl") pod "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" (UID: "6f37e61e-cd0d-4394-bdfa-a7c08c0742f3"). InnerVolumeSpecName "kube-api-access-wbpjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.866899 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.866937 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.866948 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.866960 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbpjl\" (UniqueName: \"kubernetes.io/projected/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-kube-api-access-wbpjl\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.866975 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.977169 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:06 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:06 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:06 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:06 crc kubenswrapper[4893]: I0314 07:01:06.977438 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.053922 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.091094 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:01:07 crc kubenswrapper[4893]: W0314 07:01:07.101878 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a8345c_8774_4272_887a_42b2d64a65cf.slice/crio-47937fbea22995d6de1a09e7355cd647ed372fae6e9897681b4eb0c16893141d WatchSource:0}: Error finding container 47937fbea22995d6de1a09e7355cd647ed372fae6e9897681b4eb0c16893141d: Status 404 returned error can't find the container with id 47937fbea22995d6de1a09e7355cd647ed372fae6e9897681b4eb0c16893141d Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.171365 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca\") pod \"95fe2bd1-1c99-4105-931e-6a60dd881260\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " Mar 14 07:01:07 crc 
kubenswrapper[4893]: I0314 07:01:07.171408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config\") pod \"95fe2bd1-1c99-4105-931e-6a60dd881260\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.171431 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z28pl\" (UniqueName: \"kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl\") pod \"95fe2bd1-1c99-4105-931e-6a60dd881260\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.171503 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert\") pod \"95fe2bd1-1c99-4105-931e-6a60dd881260\" (UID: \"95fe2bd1-1c99-4105-931e-6a60dd881260\") " Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.172126 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca" (OuterVolumeSpecName: "client-ca") pod "95fe2bd1-1c99-4105-931e-6a60dd881260" (UID: "95fe2bd1-1c99-4105-931e-6a60dd881260"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.172590 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config" (OuterVolumeSpecName: "config") pod "95fe2bd1-1c99-4105-931e-6a60dd881260" (UID: "95fe2bd1-1c99-4105-931e-6a60dd881260"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.176700 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl" (OuterVolumeSpecName: "kube-api-access-z28pl") pod "95fe2bd1-1c99-4105-931e-6a60dd881260" (UID: "95fe2bd1-1c99-4105-931e-6a60dd881260"). InnerVolumeSpecName "kube-api-access-z28pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.176901 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "95fe2bd1-1c99-4105-931e-6a60dd881260" (UID: "95fe2bd1-1c99-4105-931e-6a60dd881260"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.273099 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.273147 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z28pl\" (UniqueName: \"kubernetes.io/projected/95fe2bd1-1c99-4105-931e-6a60dd881260-kube-api-access-z28pl\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.273162 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95fe2bd1-1c99-4105-931e-6a60dd881260-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.273174 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95fe2bd1-1c99-4105-931e-6a60dd881260-client-ca\") on node \"crc\" DevicePath 
\"\"" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.355700 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:01:07 crc kubenswrapper[4893]: E0314 07:01:07.355942 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.355958 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: E0314 07:01:07.355993 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerName="route-controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.356003 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerName="route-controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.356104 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" containerName="controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.356120 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerName="route-controller-manager" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.356982 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.360254 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.368871 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.390653 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.477104 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.477169 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.477195 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2m5r\" (UniqueName: \"kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.578966 
4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2m5r\" (UniqueName: \"kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.579053 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.579097 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.579624 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.580067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.608665 4893 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d2m5r\" (UniqueName: \"kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r\") pod \"redhat-marketplace-gn5xf\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.611889 4893 generic.go:334] "Generic (PLEG): container finished" podID="bd723b75-2439-4310-af24-f180d494ec68" containerID="34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278" exitCode=0 Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.611986 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerDied","Data":"34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.612022 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerStarted","Data":"9232b33f6d77f897b909f7e366384a54d3b02c19c9adc33122f68997eda7ad3a"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.614477 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" event={"ID":"6f37e61e-cd0d-4394-bdfa-a7c08c0742f3","Type":"ContainerDied","Data":"fd9c4c7203ed35aba1bf7c835c91534ce3d24daa0a7215c46b3d277e32b19c4a"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.614495 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2sb4" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.614514 4893 scope.go:117] "RemoveContainer" containerID="ee417ed0d4a034b0fa24a26bc52e41cd081acb674244d479d6e3cb1b213ae4ad" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.619194 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" event={"ID":"85a8345c-8774-4272-887a-42b2d64a65cf","Type":"ContainerStarted","Data":"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.619239 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" event={"ID":"85a8345c-8774-4272-887a-42b2d64a65cf","Type":"ContainerStarted","Data":"47937fbea22995d6de1a09e7355cd647ed372fae6e9897681b4eb0c16893141d"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.619293 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.621922 4893 generic.go:334] "Generic (PLEG): container finished" podID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerID="ddc5f1090ac1a8c06dbbcbc5d6a750840f369a0d3b5747a26457caadf0fdf413" exitCode=0 Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.622005 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerDied","Data":"ddc5f1090ac1a8c06dbbcbc5d6a750840f369a0d3b5747a26457caadf0fdf413"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.622031 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" 
event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerStarted","Data":"acc8a9581e067969c51fc5d53c4ddb0fcd7ed049ae41db327b296a4eb9d42cdb"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.630288 4893 generic.go:334] "Generic (PLEG): container finished" podID="95fe2bd1-1c99-4105-931e-6a60dd881260" containerID="a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6" exitCode=0 Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.630983 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.631009 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" event={"ID":"95fe2bd1-1c99-4105-931e-6a60dd881260","Type":"ContainerDied","Data":"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.631045 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx" event={"ID":"95fe2bd1-1c99-4105-931e-6a60dd881260","Type":"ContainerDied","Data":"bb8ac88b07a14007465d064315f90614080167db4a0a536cb40f5bdd3942c16f"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.642876 4893 generic.go:334] "Generic (PLEG): container finished" podID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerID="97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233" exitCode=0 Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.642939 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerDied","Data":"97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233"} Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.650323 4893 scope.go:117] 
"RemoveContainer" containerID="a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.655153 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.655428 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2sb4"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.681574 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.687256 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" podStartSLOduration=84.687229039 podStartE2EDuration="1m24.687229039s" podCreationTimestamp="2026-03-14 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:07.672679796 +0000 UTC m=+146.934856618" watchObservedRunningTime="2026-03-14 07:01:07.687229039 +0000 UTC m=+146.949405851" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.744359 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.747134 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8wwx"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.750456 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.753106 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.764951 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.768364 4893 scope.go:117] "RemoveContainer" containerID="a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6" Mar 14 07:01:07 crc kubenswrapper[4893]: E0314 07:01:07.769780 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6\": container with ID starting with a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6 not found: ID does not exist" containerID="a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.769816 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6"} err="failed to get container status \"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6\": rpc error: code = NotFound desc = could not find container \"a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6\": container with ID starting with a74ffae6fa6d74169433ea285404d61603b4f354004b6e301fd226b0b054b3b6 not found: ID does not exist" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.890626 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.890971 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.891029 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.924646 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.977581 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:07 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:07 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:07 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.977658 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.986847 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.991712 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.991778 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:07 crc kubenswrapper[4893]: I0314 07:01:07.991822 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:07.993451 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:07.993480 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " 
pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.018205 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87\") pod \"redhat-marketplace-84hsf\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.020689 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.043823 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:08 crc kubenswrapper[4893]: E0314 07:01:08.044081 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c36b8d-4898-4780-8e79-f4986e676f9d" containerName="pruner" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.044096 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c36b8d-4898-4780-8e79-f4986e676f9d" containerName="pruner" Mar 14 07:01:08 crc kubenswrapper[4893]: E0314 07:01:08.044126 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3009e07b-2452-425c-95c3-3a78fa993d62" containerName="collect-profiles" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.044135 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3009e07b-2452-425c-95c3-3a78fa993d62" containerName="collect-profiles" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.044246 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c36b8d-4898-4780-8e79-f4986e676f9d" containerName="pruner" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.044266 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3009e07b-2452-425c-95c3-3a78fa993d62" containerName="collect-profiles" Mar 14 
07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.044819 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.046407 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.046940 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.047713 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.048079 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.048097 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.048597 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.049619 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.050280 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.054457 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.056655 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.057181 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.057334 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.057419 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.057614 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.057724 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.060615 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.074549 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.092998 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir\") pod \"f3c36b8d-4898-4780-8e79-f4986e676f9d\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093055 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume\") pod \"3009e07b-2452-425c-95c3-3a78fa993d62\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093099 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume\") pod \"3009e07b-2452-425c-95c3-3a78fa993d62\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093089 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3c36b8d-4898-4780-8e79-f4986e676f9d" (UID: "f3c36b8d-4898-4780-8e79-f4986e676f9d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093118 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvhx6\" (UniqueName: \"kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6\") pod \"3009e07b-2452-425c-95c3-3a78fa993d62\" (UID: \"3009e07b-2452-425c-95c3-3a78fa993d62\") " Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093233 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access\") pod \"f3c36b8d-4898-4780-8e79-f4986e676f9d\" (UID: \"f3c36b8d-4898-4780-8e79-f4986e676f9d\") " Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.093610 4893 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3c36b8d-4898-4780-8e79-f4986e676f9d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.094097 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume" (OuterVolumeSpecName: "config-volume") pod "3009e07b-2452-425c-95c3-3a78fa993d62" (UID: "3009e07b-2452-425c-95c3-3a78fa993d62"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.096921 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6" (OuterVolumeSpecName: "kube-api-access-vvhx6") pod "3009e07b-2452-425c-95c3-3a78fa993d62" (UID: "3009e07b-2452-425c-95c3-3a78fa993d62"). InnerVolumeSpecName "kube-api-access-vvhx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.097621 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3c36b8d-4898-4780-8e79-f4986e676f9d" (UID: "f3c36b8d-4898-4780-8e79-f4986e676f9d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.099760 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3009e07b-2452-425c-95c3-3a78fa993d62" (UID: "3009e07b-2452-425c-95c3-3a78fa993d62"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.123115 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.194981 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmc6\" (UniqueName: \"kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195025 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195098 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195124 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195145 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195167 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195202 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195227 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195244 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skhg\" (UniqueName: \"kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg\") pod \"route-controller-manager-7f8c694bcf-tk89h\" 
(UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195277 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3009e07b-2452-425c-95c3-3a78fa993d62-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195288 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvhx6\" (UniqueName: \"kubernetes.io/projected/3009e07b-2452-425c-95c3-3a78fa993d62-kube-api-access-vvhx6\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195297 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3009e07b-2452-425c-95c3-3a78fa993d62-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.195305 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3c36b8d-4898-4780-8e79-f4986e676f9d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.298653 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299045 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " 
pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299113 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299175 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299294 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299369 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299398 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skhg\" (UniqueName: 
\"kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299505 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmc6\" (UniqueName: \"kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.299562 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.301688 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.301987 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc 
kubenswrapper[4893]: I0314 07:01:08.303458 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.303767 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.306228 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.307378 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.308166 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " 
pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.324242 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skhg\" (UniqueName: \"kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg\") pod \"route-controller-manager-7f8c694bcf-tk89h\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.328264 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmc6\" (UniqueName: \"kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6\") pod \"controller-manager-557b98d776-rjlsc\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.339583 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.385025 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.396189 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.666353 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerStarted","Data":"6975f86bbf0ae73d6c39bd214508fc8255f16230902b8dc4881c906e227fe6fd"} Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.670553 4893 generic.go:334] "Generic (PLEG): container finished" podID="212d2416-8201-4cae-a8b9-3121de2e8348" containerID="0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937" exitCode=0 Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.670612 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerDied","Data":"0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937"} Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.670637 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerStarted","Data":"e73e11a07f7d3f2820c603e841a9048a4b79bdab2f7ae5ec2fd94c3462e6a997"} Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.677221 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" event={"ID":"3009e07b-2452-425c-95c3-3a78fa993d62","Type":"ContainerDied","Data":"7a0ae4aa4520ad5604e171adcc4ccc71f706656e975e7a676c3c34a478e0e9f2"} Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.677260 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a0ae4aa4520ad5604e171adcc4ccc71f706656e975e7a676c3c34a478e0e9f2" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.677340 4893 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.683700 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3c36b8d-4898-4780-8e79-f4986e676f9d","Type":"ContainerDied","Data":"d59c7d7390bfbb8861a61eb3b91f23e2a453aa379a2fd061897f486b3a6ad5db"} Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.683800 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d59c7d7390bfbb8861a61eb3b91f23e2a453aa379a2fd061897f486b3a6ad5db" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.683933 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.762015 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.763413 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.765006 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.766779 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:01:08 crc kubenswrapper[4893]: W0314 07:01:08.768373 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8531aa7_1637_458e_9477_41aa6f89ac67.slice/crio-995caf1824cf15547438e0690e2f7ad44a0b1e4a9c93e72f85c7aeda7939a8ad WatchSource:0}: Error finding container 995caf1824cf15547438e0690e2f7ad44a0b1e4a9c93e72f85c7aeda7939a8ad: Status 404 returned error can't find the container with id 995caf1824cf15547438e0690e2f7ad44a0b1e4a9c93e72f85c7aeda7939a8ad Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.769950 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.894241 4893 patch_prober.go:28] interesting pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.894308 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.894321 4893 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.894371 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.911378 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.911425 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.911485 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldjfm\" (UniqueName: \"kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.972918 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.974654 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.980774 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:08 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:08 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:08 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:08 crc kubenswrapper[4893]: I0314 07:01:08.980828 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:08 crc kubenswrapper[4893]: W0314 07:01:08.983164 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d2015c_7890_46c7_af4c_f41ba6d04737.slice/crio-934e88daa1b17df22e6a4c45f4f77335f26cc4e0ae74392899d602f0170d31d2 WatchSource:0}: Error finding container 934e88daa1b17df22e6a4c45f4f77335f26cc4e0ae74392899d602f0170d31d2: Status 404 returned error can't find the container with id 934e88daa1b17df22e6a4c45f4f77335f26cc4e0ae74392899d602f0170d31d2 Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.013012 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " 
pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.013419 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.013683 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.013825 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.013904 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldjfm\" (UniqueName: \"kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.037901 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldjfm\" (UniqueName: \"kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm\") pod \"redhat-operators-xsf27\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 
crc kubenswrapper[4893]: E0314 07:01:09.125363 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:09 crc kubenswrapper[4893]: E0314 07:01:09.127219 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:09 crc kubenswrapper[4893]: E0314 07:01:09.129955 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:09 crc kubenswrapper[4893]: E0314 07:01:09.130010 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.155798 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.156998 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.163237 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.163310 4893 ???:1] "http: TLS handshake error from 192.168.126.11:36304: no serving certificate available for the kubelet" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.220066 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.220921 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.229226 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.231330 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.231621 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.240251 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.317110 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-227db\" (UniqueName: \"kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.317175 
4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.317210 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.317290 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.317321 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.318615 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.319478 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.320955 4893 
patch_prober.go:28] interesting pod/console-f9d7485db-psm2j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.320991 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-psm2j" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.397925 4893 patch_prober.go:28] interesting pod/apiserver-76f77b778f-btl2s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]log ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]etcd ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/generic-apiserver-start-informers ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/max-in-flight-filter ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 14 07:01:09 crc kubenswrapper[4893]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/project.openshift.io-projectcache ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 14 07:01:09 crc kubenswrapper[4893]: 
[+]poststarthook/openshift.io-startinformers ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 14 07:01:09 crc kubenswrapper[4893]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 14 07:01:09 crc kubenswrapper[4893]: livez check failed Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.397974 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" podUID="ef07db86-4677-41ce-8b6d-7960cc63a9b8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.418239 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.418651 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.418780 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.418811 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.418833 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-227db\" (UniqueName: \"kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.419746 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.420218 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.430043 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.440665 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f37e61e-cd0d-4394-bdfa-a7c08c0742f3" path="/var/lib/kubelet/pods/6f37e61e-cd0d-4394-bdfa-a7c08c0742f3/volumes" 
Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.441477 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fe2bd1-1c99-4105-931e-6a60dd881260" path="/var/lib/kubelet/pods/95fe2bd1-1c99-4105-931e-6a60dd881260/volumes" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.444230 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.444261 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.463122 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.510693 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-227db\" (UniqueName: \"kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db\") pod \"redhat-operators-6hlsj\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.537011 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.541792 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.766030 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" event={"ID":"a8531aa7-1637-458e-9477-41aa6f89ac67","Type":"ContainerStarted","Data":"abe1124fdb72efcc0b1c3cbafeeaa4fefb56d790b52a9835d17c03e70b930a07"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.766508 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" event={"ID":"a8531aa7-1637-458e-9477-41aa6f89ac67","Type":"ContainerStarted","Data":"995caf1824cf15547438e0690e2f7ad44a0b1e4a9c93e72f85c7aeda7939a8ad"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.776385 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerStarted","Data":"dfcbbd27839215f5bfe14727bea300df23cbcad886196e54e6bdd5783b284eee"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.783310 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" event={"ID":"f5d2015c-7890-46c7-af4c-f41ba6d04737","Type":"ContainerStarted","Data":"84360df70c3c0e454d28c422a3ce61eda4e38b12f2ea621d2d33c6b3a3813f66"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.783386 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" event={"ID":"f5d2015c-7890-46c7-af4c-f41ba6d04737","Type":"ContainerStarted","Data":"934e88daa1b17df22e6a4c45f4f77335f26cc4e0ae74392899d602f0170d31d2"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.789427 4893 generic.go:334] "Generic (PLEG): container finished" podID="00db91c2-32b1-4b21-9795-e47042b4b9a4" 
containerID="4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5" exitCode=0 Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.790529 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerDied","Data":"4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5"} Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.795234 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.979290 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:09 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:09 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:09 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:09 crc kubenswrapper[4893]: I0314 07:01:09.979366 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.078689 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.224743 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.801608 4893 generic.go:334] "Generic (PLEG): container finished" podID="3a28de63-7c73-4b79-9242-7dda511afc68" 
containerID="474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782" exitCode=0 Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.801778 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerDied","Data":"474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782"} Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.804903 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685","Type":"ContainerStarted","Data":"23ee05b30f9ba348ade34e6209a4f22402508e3cdbd17fe54d4a3376d0b560e7"} Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.804926 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685","Type":"ContainerStarted","Data":"3ea080d71ff6a309beab197b8ebb76436a2e4b0de7bb32b2c8cdae6725683e11"} Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.807285 4893 generic.go:334] "Generic (PLEG): container finished" podID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerID="2ea75b4d656e4cb1f477668083b27517f228be9251653a28525ccafe25c86c0e" exitCode=0 Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.807362 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerDied","Data":"2ea75b4d656e4cb1f477668083b27517f228be9251653a28525ccafe25c86c0e"} Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.807384 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerStarted","Data":"8479710ac7103e8375eadf05c3fa294bd42048ca83ede063787971a3d95b495d"} Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 
07:01:10.807737 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.820148 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.828989 4893 ???:1] "http: TLS handshake error from 192.168.126.11:36316: no serving certificate available for the kubelet" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.862566 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" podStartSLOduration=4.862537798 podStartE2EDuration="4.862537798s" podCreationTimestamp="2026-03-14 07:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:10.836431663 +0000 UTC m=+150.098608475" watchObservedRunningTime="2026-03-14 07:01:10.862537798 +0000 UTC m=+150.124714590" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.864775 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" podStartSLOduration=4.864767452 podStartE2EDuration="4.864767452s" podCreationTimestamp="2026-03-14 07:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:10.859813982 +0000 UTC m=+150.121990784" watchObservedRunningTime="2026-03-14 07:01:10.864767452 +0000 UTC m=+150.126944244" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.914966 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.914946502 podStartE2EDuration="1.914946502s" 
podCreationTimestamp="2026-03-14 07:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:10.912465032 +0000 UTC m=+150.174641814" watchObservedRunningTime="2026-03-14 07:01:10.914946502 +0000 UTC m=+150.177123294" Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.977090 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:10 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:10 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:10 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:10 crc kubenswrapper[4893]: I0314 07:01:10.977542 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:11 crc kubenswrapper[4893]: I0314 07:01:11.831207 4893 generic.go:334] "Generic (PLEG): container finished" podID="cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" containerID="23ee05b30f9ba348ade34e6209a4f22402508e3cdbd17fe54d4a3376d0b560e7" exitCode=0 Mar 14 07:01:11 crc kubenswrapper[4893]: I0314 07:01:11.831367 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685","Type":"ContainerDied","Data":"23ee05b30f9ba348ade34e6209a4f22402508e3cdbd17fe54d4a3376d0b560e7"} Mar 14 07:01:11 crc kubenswrapper[4893]: I0314 07:01:11.976182 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:11 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:11 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:11 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:11 crc kubenswrapper[4893]: I0314 07:01:11.976334 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:12 crc kubenswrapper[4893]: I0314 07:01:12.389296 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 14 07:01:12 crc kubenswrapper[4893]: I0314 07:01:12.976710 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:12 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:12 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:12 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:12 crc kubenswrapper[4893]: I0314 07:01:12.976765 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:13 crc kubenswrapper[4893]: I0314 07:01:13.976574 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:13 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:13 crc 
kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:13 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:13 crc kubenswrapper[4893]: I0314 07:01:13.977042 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.310562 4893 ???:1] "http: TLS handshake error from 192.168.126.11:36324: no serving certificate available for the kubelet" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.389871 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.394496 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-btl2s" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.409378 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.409357118 podStartE2EDuration="2.409357118s" podCreationTimestamp="2026-03-14 07:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:14.404225233 +0000 UTC m=+153.666402035" watchObservedRunningTime="2026-03-14 07:01:14.409357118 +0000 UTC m=+153.671533910" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.480356 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xxwxx" Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.975769 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:14 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:14 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:14 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:14 crc kubenswrapper[4893]: I0314 07:01:14.975833 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:15 crc kubenswrapper[4893]: I0314 07:01:15.976483 4893 patch_prober.go:28] interesting pod/router-default-5444994796-ss7r8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:01:15 crc kubenswrapper[4893]: [-]has-synced failed: reason withheld Mar 14 07:01:15 crc kubenswrapper[4893]: [+]process-running ok Mar 14 07:01:15 crc kubenswrapper[4893]: healthz check failed Mar 14 07:01:15 crc kubenswrapper[4893]: I0314 07:01:15.976815 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ss7r8" podUID="1c71efc6-2e03-405c-84f9-6ba44b085df4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:01:16 crc kubenswrapper[4893]: I0314 07:01:16.975889 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:01:16 crc kubenswrapper[4893]: I0314 07:01:16.979123 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ss7r8" Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.396721 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.402099 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.877164 4893 patch_prober.go:28] interesting pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.877372 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.877640 4893 patch_prober.go:28] interesting pod/downloads-7954f5f757-vkgvv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 14 07:01:18 crc kubenswrapper[4893]: I0314 07:01:18.877709 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vkgvv" podUID="b1b92f23-a052-41c6-817f-d43b04079105" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 14 07:01:19 crc kubenswrapper[4893]: E0314 07:01:19.126677 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:19 crc kubenswrapper[4893]: E0314 07:01:19.130231 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:19 crc kubenswrapper[4893]: E0314 07:01:19.131609 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:19 crc kubenswrapper[4893]: E0314 07:01:19.131640 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:19 crc kubenswrapper[4893]: I0314 07:01:19.318796 4893 patch_prober.go:28] interesting pod/console-f9d7485db-psm2j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 14 07:01:19 crc kubenswrapper[4893]: I0314 07:01:19.318858 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-psm2j" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 
14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.663581 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.820752 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir\") pod \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.820891 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access\") pod \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\" (UID: \"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685\") " Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.820968 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" (UID: "cacbe7d2-a8d1-4fb4-9ece-ac61ff329685"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.821231 4893 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.829612 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" (UID: "cacbe7d2-a8d1-4fb4-9ece-ac61ff329685"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.908741 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cacbe7d2-a8d1-4fb4-9ece-ac61ff329685","Type":"ContainerDied","Data":"3ea080d71ff6a309beab197b8ebb76436a2e4b0de7bb32b2c8cdae6725683e11"} Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.908789 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.908792 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ea080d71ff6a309beab197b8ebb76436a2e4b0de7bb32b2c8cdae6725683e11" Mar 14 07:01:21 crc kubenswrapper[4893]: I0314 07:01:21.924112 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cacbe7d2-a8d1-4fb4-9ece-ac61ff329685-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:25 crc kubenswrapper[4893]: I0314 07:01:25.814134 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:25 crc kubenswrapper[4893]: I0314 07:01:25.815918 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerName="controller-manager" containerID="cri-o://84360df70c3c0e454d28c422a3ce61eda4e38b12f2ea621d2d33c6b3a3813f66" gracePeriod=30 Mar 14 07:01:25 crc kubenswrapper[4893]: I0314 07:01:25.831670 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:25 crc kubenswrapper[4893]: I0314 07:01:25.832112 4893 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" containerID="cri-o://abe1124fdb72efcc0b1c3cbafeeaa4fefb56d790b52a9835d17c03e70b930a07" gracePeriod=30 Mar 14 07:01:26 crc kubenswrapper[4893]: I0314 07:01:26.610424 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:01:26 crc kubenswrapper[4893]: I0314 07:01:26.939767 4893 generic.go:334] "Generic (PLEG): container finished" podID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerID="abe1124fdb72efcc0b1c3cbafeeaa4fefb56d790b52a9835d17c03e70b930a07" exitCode=0 Mar 14 07:01:26 crc kubenswrapper[4893]: I0314 07:01:26.939832 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" event={"ID":"a8531aa7-1637-458e-9477-41aa6f89ac67","Type":"ContainerDied","Data":"abe1124fdb72efcc0b1c3cbafeeaa4fefb56d790b52a9835d17c03e70b930a07"} Mar 14 07:01:26 crc kubenswrapper[4893]: I0314 07:01:26.941687 4893 generic.go:334] "Generic (PLEG): container finished" podID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerID="84360df70c3c0e454d28c422a3ce61eda4e38b12f2ea621d2d33c6b3a3813f66" exitCode=0 Mar 14 07:01:26 crc kubenswrapper[4893]: I0314 07:01:26.941719 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" event={"ID":"f5d2015c-7890-46c7-af4c-f41ba6d04737","Type":"ContainerDied","Data":"84360df70c3c0e454d28c422a3ce61eda4e38b12f2ea621d2d33c6b3a3813f66"} Mar 14 07:01:28 crc kubenswrapper[4893]: I0314 07:01:28.895710 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vkgvv" Mar 14 07:01:29 crc kubenswrapper[4893]: E0314 07:01:29.123540 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:29 crc kubenswrapper[4893]: E0314 07:01:29.124741 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:29 crc kubenswrapper[4893]: E0314 07:01:29.126361 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:29 crc kubenswrapper[4893]: E0314 07:01:29.126456 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.324133 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.333049 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.386205 4893 patch_prober.go:28] interesting pod/controller-manager-557b98d776-rjlsc container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.387034 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.397776 4893 patch_prober.go:28] interesting pod/route-controller-manager-7f8c694bcf-tk89h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" start-of-body= Mar 14 07:01:29 crc kubenswrapper[4893]: I0314 07:01:29.397873 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.095415 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.133812 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:34 crc kubenswrapper[4893]: E0314 07:01:34.134070 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" containerName="pruner" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.134090 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" containerName="pruner" Mar 14 07:01:34 crc kubenswrapper[4893]: E0314 07:01:34.134116 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerName="controller-manager" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.134124 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerName="controller-manager" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.134251 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" containerName="controller-manager" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.134267 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacbe7d2-a8d1-4fb4-9ece-ac61ff329685" containerName="pruner" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.134786 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.143870 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.187110 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmc6\" (UniqueName: \"kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6\") pod \"f5d2015c-7890-46c7-af4c-f41ba6d04737\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.187158 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config\") pod \"f5d2015c-7890-46c7-af4c-f41ba6d04737\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.187203 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca\") pod \"f5d2015c-7890-46c7-af4c-f41ba6d04737\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.187247 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles\") pod \"f5d2015c-7890-46c7-af4c-f41ba6d04737\" (UID: \"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.187277 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert\") pod \"f5d2015c-7890-46c7-af4c-f41ba6d04737\" (UID: 
\"f5d2015c-7890-46c7-af4c-f41ba6d04737\") " Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.188161 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config" (OuterVolumeSpecName: "config") pod "f5d2015c-7890-46c7-af4c-f41ba6d04737" (UID: "f5d2015c-7890-46c7-af4c-f41ba6d04737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.188396 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca" (OuterVolumeSpecName: "client-ca") pod "f5d2015c-7890-46c7-af4c-f41ba6d04737" (UID: "f5d2015c-7890-46c7-af4c-f41ba6d04737"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.188480 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f5d2015c-7890-46c7-af4c-f41ba6d04737" (UID: "f5d2015c-7890-46c7-af4c-f41ba6d04737"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.193436 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f5d2015c-7890-46c7-af4c-f41ba6d04737" (UID: "f5d2015c-7890-46c7-af4c-f41ba6d04737"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.199390 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6" (OuterVolumeSpecName: "kube-api-access-mdmc6") pod "f5d2015c-7890-46c7-af4c-f41ba6d04737" (UID: "f5d2015c-7890-46c7-af4c-f41ba6d04737"). InnerVolumeSpecName "kube-api-access-mdmc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.288707 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.288765 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgj8\" (UniqueName: \"kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.288805 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289017 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289082 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289228 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdmc6\" (UniqueName: \"kubernetes.io/projected/f5d2015c-7890-46c7-af4c-f41ba6d04737-kube-api-access-mdmc6\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289249 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289260 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289270 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d2015c-7890-46c7-af4c-f41ba6d04737-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.289279 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d2015c-7890-46c7-af4c-f41ba6d04737-serving-cert\") 
on node \"crc\" DevicePath \"\"" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.389974 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.390034 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.390099 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.390123 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgj8\" (UniqueName: \"kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.390164 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles\") pod 
\"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.391627 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.392135 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.393488 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.396104 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.413952 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgj8\" (UniqueName: 
\"kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8\") pod \"controller-manager-84b49d764f-vrgs2\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.456948 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:34 crc kubenswrapper[4893]: I0314 07:01:34.812047 4893 ???:1] "http: TLS handshake error from 192.168.126.11:37544: no serving certificate available for the kubelet" Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.018961 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.018983 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-557b98d776-rjlsc" event={"ID":"f5d2015c-7890-46c7-af4c-f41ba6d04737","Type":"ContainerDied","Data":"934e88daa1b17df22e6a4c45f4f77335f26cc4e0ae74392899d602f0170d31d2"} Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.019159 4893 scope.go:117] "RemoveContainer" containerID="84360df70c3c0e454d28c422a3ce61eda4e38b12f2ea621d2d33c6b3a3813f66" Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.022497 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6xtr4_cf444907-0c34-43ab-9bbd-b9ef0773743c/kube-multus-additional-cni-plugins/0.log" Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.022760 4893 generic.go:334] "Generic (PLEG): container finished" podID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" exitCode=137 Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.022803 4893 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" event={"ID":"cf444907-0c34-43ab-9bbd-b9ef0773743c","Type":"ContainerDied","Data":"5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca"} Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.072176 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.075440 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-557b98d776-rjlsc"] Mar 14 07:01:35 crc kubenswrapper[4893]: I0314 07:01:35.384848 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d2015c-7890-46c7-af4c-f41ba6d04737" path="/var/lib/kubelet/pods/f5d2015c-7890-46c7-af4c-f41ba6d04737/volumes" Mar 14 07:01:35 crc kubenswrapper[4893]: E0314 07:01:35.954001 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 07:01:35 crc kubenswrapper[4893]: E0314 07:01:35.954176 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5crnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-727v2_openshift-marketplace(6eb806cc-dc34-40ff-b7d5-c33a575822ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:35 crc kubenswrapper[4893]: E0314 07:01:35.955350 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-727v2" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" Mar 14 07:01:38 crc 
kubenswrapper[4893]: E0314 07:01:38.177001 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 07:01:38 crc kubenswrapper[4893]: E0314 07:01:38.178388 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ks8zw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-vhk5b_openshift-marketplace(28f15d4e-4de1-481e-bd52-d4dcada252a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:38 crc kubenswrapper[4893]: E0314 07:01:38.179760 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vhk5b" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" Mar 14 07:01:39 crc kubenswrapper[4893]: I0314 07:01:39.052006 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vcv9c" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.121910 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca is running failed: container process not found" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.122272 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca is running failed: container process not found" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.122645 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca is running failed: container process not found" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.122691 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:39 crc kubenswrapper[4893]: I0314 07:01:39.397036 4893 patch_prober.go:28] interesting pod/route-controller-manager-7f8c694bcf-tk89h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" start-of-body= Mar 14 07:01:39 crc kubenswrapper[4893]: I0314 07:01:39.397100 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.706490 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vhk5b" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.706597 4893 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-727v2" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.901405 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.901637 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d2m5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,Ap
pArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gn5xf_openshift-marketplace(212d2416-8201-4cae-a8b9-3121de2e8348): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:39 crc kubenswrapper[4893]: E0314 07:01:39.902871 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gn5xf" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" Mar 14 07:01:40 crc kubenswrapper[4893]: E0314 07:01:40.353591 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 07:01:40 crc kubenswrapper[4893]: E0314 07:01:40.353926 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dq87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-84hsf_openshift-marketplace(00db91c2-32b1-4b21-9795-e47042b4b9a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:40 crc kubenswrapper[4893]: E0314 07:01:40.355074 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-84hsf" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" Mar 14 07:01:40 crc 
kubenswrapper[4893]: I0314 07:01:40.829168 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.829966 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.832204 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.832343 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.835316 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.982152 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:40 crc kubenswrapper[4893]: I0314 07:01:40.982201 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.082961 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.083039 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.083175 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.102083 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.153022 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:41 crc kubenswrapper[4893]: I0314 07:01:41.511339 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.499051 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gn5xf" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.499478 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-84hsf" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.548626 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.549042 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldjfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xsf27_openshift-marketplace(3a28de63-7c73-4b79-9242-7dda511afc68): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.550238 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xsf27" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" Mar 14 07:01:43 crc 
kubenswrapper[4893]: E0314 07:01:43.589880 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.590267 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-227db,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-6hlsj_openshift-marketplace(250e7aeb-ae56-47aa-99b5-47e01b338fd1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.591789 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6hlsj" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.612342 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.626538 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6xtr4_cf444907-0c34-43ab-9bbd-b9ef0773743c/kube-multus-additional-cni-plugins/0.log" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.626673 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.647357 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.647652 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.647669 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:43 crc kubenswrapper[4893]: E0314 07:01:43.647691 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.647700 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.647859 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" containerName="kube-multus-additional-cni-plugins" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.647872 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" containerName="route-controller-manager" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.648310 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.665430 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.717863 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready\") pod \"cf444907-0c34-43ab-9bbd-b9ef0773743c\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.717924 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert\") pod \"a8531aa7-1637-458e-9477-41aa6f89ac67\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.717952 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca\") pod \"a8531aa7-1637-458e-9477-41aa6f89ac67\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.717993 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config\") pod \"a8531aa7-1637-458e-9477-41aa6f89ac67\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718018 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqmf\" (UniqueName: \"kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf\") pod \"cf444907-0c34-43ab-9bbd-b9ef0773743c\" (UID: 
\"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718045 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist\") pod \"cf444907-0c34-43ab-9bbd-b9ef0773743c\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718090 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6skhg\" (UniqueName: \"kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg\") pod \"a8531aa7-1637-458e-9477-41aa6f89ac67\" (UID: \"a8531aa7-1637-458e-9477-41aa6f89ac67\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718110 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir\") pod \"cf444907-0c34-43ab-9bbd-b9ef0773743c\" (UID: \"cf444907-0c34-43ab-9bbd-b9ef0773743c\") " Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718319 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr99l\" (UniqueName: \"kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718351 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718370 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.718395 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.721027 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config" (OuterVolumeSpecName: "config") pod "a8531aa7-1637-458e-9477-41aa6f89ac67" (UID: "a8531aa7-1637-458e-9477-41aa6f89ac67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.721438 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8531aa7-1637-458e-9477-41aa6f89ac67" (UID: "a8531aa7-1637-458e-9477-41aa6f89ac67"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.721868 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "cf444907-0c34-43ab-9bbd-b9ef0773743c" (UID: "cf444907-0c34-43ab-9bbd-b9ef0773743c"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.722598 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready" (OuterVolumeSpecName: "ready") pod "cf444907-0c34-43ab-9bbd-b9ef0773743c" (UID: "cf444907-0c34-43ab-9bbd-b9ef0773743c"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.722692 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "cf444907-0c34-43ab-9bbd-b9ef0773743c" (UID: "cf444907-0c34-43ab-9bbd-b9ef0773743c"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.731566 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg" (OuterVolumeSpecName: "kube-api-access-6skhg") pod "a8531aa7-1637-458e-9477-41aa6f89ac67" (UID: "a8531aa7-1637-458e-9477-41aa6f89ac67"). InnerVolumeSpecName "kube-api-access-6skhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.731782 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8531aa7-1637-458e-9477-41aa6f89ac67" (UID: "a8531aa7-1637-458e-9477-41aa6f89ac67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.732544 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf" (OuterVolumeSpecName: "kube-api-access-wrqmf") pod "cf444907-0c34-43ab-9bbd-b9ef0773743c" (UID: "cf444907-0c34-43ab-9bbd-b9ef0773743c"). InnerVolumeSpecName "kube-api-access-wrqmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819222 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819324 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr99l\" (UniqueName: \"kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819351 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819368 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819415 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqmf\" (UniqueName: \"kubernetes.io/projected/cf444907-0c34-43ab-9bbd-b9ef0773743c-kube-api-access-wrqmf\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819428 4893 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf444907-0c34-43ab-9bbd-b9ef0773743c-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819438 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6skhg\" (UniqueName: \"kubernetes.io/projected/a8531aa7-1637-458e-9477-41aa6f89ac67-kube-api-access-6skhg\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819447 4893 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf444907-0c34-43ab-9bbd-b9ef0773743c-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819456 4893 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/cf444907-0c34-43ab-9bbd-b9ef0773743c-ready\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819464 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8531aa7-1637-458e-9477-41aa6f89ac67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819472 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.819479 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8531aa7-1637-458e-9477-41aa6f89ac67-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.820940 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.821931 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.823174 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert\") pod 
\"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.835822 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr99l\" (UniqueName: \"kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l\") pod \"route-controller-manager-6d5584569d-z5tfn\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.985651 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:43 crc kubenswrapper[4893]: W0314 07:01:43.989144 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod023e988c_ee82_44cf_aef6_235e9f4dbce0.slice/crio-f58bfba1615044406339b45604c2ddc84c23c5ed6535da8086919c497769a3f3 WatchSource:0}: Error finding container f58bfba1615044406339b45604c2ddc84c23c5ed6535da8086919c497769a3f3: Status 404 returned error can't find the container with id f58bfba1615044406339b45604c2ddc84c23c5ed6535da8086919c497769a3f3 Mar 14 07:01:43 crc kubenswrapper[4893]: I0314 07:01:43.996488 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.040582 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.070500 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" event={"ID":"a8531aa7-1637-458e-9477-41aa6f89ac67","Type":"ContainerDied","Data":"995caf1824cf15547438e0690e2f7ad44a0b1e4a9c93e72f85c7aeda7939a8ad"} Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.070564 4893 scope.go:117] "RemoveContainer" containerID="abe1124fdb72efcc0b1c3cbafeeaa4fefb56d790b52a9835d17c03e70b930a07" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.070651 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.081625 4893 generic.go:334] "Generic (PLEG): container finished" podID="bd723b75-2439-4310-af24-f180d494ec68" containerID="e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06" exitCode=0 Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.081699 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerDied","Data":"e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06"} Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.085896 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6xtr4_cf444907-0c34-43ab-9bbd-b9ef0773743c/kube-multus-additional-cni-plugins/0.log" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.086196 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" event={"ID":"cf444907-0c34-43ab-9bbd-b9ef0773743c","Type":"ContainerDied","Data":"65fff7269dcb3b043d060705f66ef7e6b97c8cbfbb1957f67070736938bf0e58"} Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.086257 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6xtr4" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.116986 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" event={"ID":"023e988c-ee82-44cf-aef6-235e9f4dbce0","Type":"ContainerStarted","Data":"f58bfba1615044406339b45604c2ddc84c23c5ed6535da8086919c497769a3f3"} Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.119894 4893 scope.go:117] "RemoveContainer" containerID="5d47d584e62acecf57128aff773b22cb948e4d61adb20657341d87d377075cca" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.121446 4893 generic.go:334] "Generic (PLEG): container finished" podID="b1c55410-c44f-483c-801a-de26ae05a415" containerID="7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e" exitCode=0 Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.121513 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerDied","Data":"7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e"} Mar 14 07:01:44 crc kubenswrapper[4893]: E0314 07:01:44.123503 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6hlsj" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" Mar 14 07:01:44 crc kubenswrapper[4893]: E0314 07:01:44.123536 4893 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xsf27" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.172825 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6xtr4"] Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.175749 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6xtr4"] Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.227078 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.232671 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c694bcf-tk89h"] Mar 14 07:01:44 crc kubenswrapper[4893]: I0314 07:01:44.234354 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.128854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" event={"ID":"b2f826d8-aa31-498d-8956-d2878fceba76","Type":"ContainerStarted","Data":"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.129462 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.129475 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" 
event={"ID":"b2f826d8-aa31-498d-8956-d2878fceba76","Type":"ContainerStarted","Data":"df52ebcb46356dbd7fe2e27e4e425c160203e1c8e84054c5d7245796ee96ad76"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.131919 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" event={"ID":"023e988c-ee82-44cf-aef6-235e9f4dbce0","Type":"ContainerStarted","Data":"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.132777 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.136110 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.137955 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.139031 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerStarted","Data":"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.142407 4893 generic.go:334] "Generic (PLEG): container finished" podID="c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" containerID="700f8976879d26662dd962a376f68a8273479478d6c4f0dcf649f0dccb8ace98" exitCode=0 Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.142550 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1","Type":"ContainerDied","Data":"700f8976879d26662dd962a376f68a8273479478d6c4f0dcf649f0dccb8ace98"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.142571 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1","Type":"ContainerStarted","Data":"e986271e61d2ac10025ed29eaeb78c1fdcd020b2a738253916c0d35f5b08f027"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.144890 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerStarted","Data":"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774"} Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.149538 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" podStartSLOduration=20.14950832 podStartE2EDuration="20.14950832s" podCreationTimestamp="2026-03-14 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:45.145838552 +0000 UTC m=+184.408015364" watchObservedRunningTime="2026-03-14 07:01:45.14950832 +0000 UTC m=+184.411685112" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.168987 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cmmld" podStartSLOduration=2.132663618 podStartE2EDuration="40.168969203s" podCreationTimestamp="2026-03-14 07:01:05 +0000 UTC" firstStartedPulling="2026-03-14 07:01:06.546855288 +0000 UTC m=+145.809032080" lastFinishedPulling="2026-03-14 07:01:44.583160873 +0000 UTC m=+183.845337665" observedRunningTime="2026-03-14 07:01:45.166724508 +0000 UTC m=+184.428901300" watchObservedRunningTime="2026-03-14 
07:01:45.168969203 +0000 UTC m=+184.431145995" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.215003 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dkbml" podStartSLOduration=3.004824619 podStartE2EDuration="40.214981102s" podCreationTimestamp="2026-03-14 07:01:05 +0000 UTC" firstStartedPulling="2026-03-14 07:01:07.614811548 +0000 UTC m=+146.876988340" lastFinishedPulling="2026-03-14 07:01:44.824968041 +0000 UTC m=+184.087144823" observedRunningTime="2026-03-14 07:01:45.213322932 +0000 UTC m=+184.475499744" watchObservedRunningTime="2026-03-14 07:01:45.214981102 +0000 UTC m=+184.477157894" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.247506 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" podStartSLOduration=20.247489072 podStartE2EDuration="20.247489072s" podCreationTimestamp="2026-03-14 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:45.24697057 +0000 UTC m=+184.509147382" watchObservedRunningTime="2026-03-14 07:01:45.247489072 +0000 UTC m=+184.509665864" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.385454 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8531aa7-1637-458e-9477-41aa6f89ac67" path="/var/lib/kubelet/pods/a8531aa7-1637-458e-9477-41aa6f89ac67/volumes" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.386354 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf444907-0c34-43ab-9bbd-b9ef0773743c" path="/var/lib/kubelet/pods/cf444907-0c34-43ab-9bbd-b9ef0773743c/volumes" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.684700 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:45 crc 
kubenswrapper[4893]: I0314 07:01:45.684884 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.800068 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.807110 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.807778 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.824645 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.919116 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.946004 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.946248 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:45 crc kubenswrapper[4893]: I0314 07:01:45.946305 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.046983 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.047050 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.047081 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.047133 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.047193 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.065641 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access\") pod \"installer-9-crc\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.125497 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.352167 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.352514 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.454879 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.553280 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir\") pod \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.553349 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access\") pod \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\" (UID: \"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1\") " Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.553513 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" (UID: "c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.553754 4893 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.559634 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" (UID: "c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:46 crc kubenswrapper[4893]: W0314 07:01:46.609611 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc077ca86_7535_42ff_b1c4_2acec5092071.slice/crio-f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db WatchSource:0}: Error finding container f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db: Status 404 returned error can't find the container with id f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.611049 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.654989 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:46 crc kubenswrapper[4893]: I0314 07:01:46.853029 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-cmmld" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="registry-server" probeResult="failure" output=< Mar 14 07:01:46 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:01:46 crc kubenswrapper[4893]: > Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.155137 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1","Type":"ContainerDied","Data":"e986271e61d2ac10025ed29eaeb78c1fdcd020b2a738253916c0d35f5b08f027"} Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.155410 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e986271e61d2ac10025ed29eaeb78c1fdcd020b2a738253916c0d35f5b08f027" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 
07:01:47.155144 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.156429 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c077ca86-7535-42ff-b1c4-2acec5092071","Type":"ContainerStarted","Data":"a36a94358c8a537c018de2194283954daf43f5f034f747459589670389e03d57"} Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.156465 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c077ca86-7535-42ff-b1c4-2acec5092071","Type":"ContainerStarted","Data":"f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db"} Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.156749 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" podUID="023e988c-ee82-44cf-aef6-235e9f4dbce0" containerName="controller-manager" containerID="cri-o://3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3" gracePeriod=30 Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.157252 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" podUID="b2f826d8-aa31-498d-8956-d2878fceba76" containerName="route-controller-manager" containerID="cri-o://419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568" gracePeriod=30 Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.177202 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.177183931 podStartE2EDuration="2.177183931s" podCreationTimestamp="2026-03-14 07:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 07:01:47.174330482 +0000 UTC m=+186.436507274" watchObservedRunningTime="2026-03-14 07:01:47.177183931 +0000 UTC m=+186.439360723" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.394673 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dkbml" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="registry-server" probeResult="failure" output=< Mar 14 07:01:47 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:01:47 crc kubenswrapper[4893]: > Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.564256 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.593273 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667130 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config\") pod \"023e988c-ee82-44cf-aef6-235e9f4dbce0\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667403 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config\") pod \"b2f826d8-aa31-498d-8956-d2878fceba76\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667472 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr99l\" (UniqueName: \"kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l\") pod 
\"b2f826d8-aa31-498d-8956-d2878fceba76\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667499 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgj8\" (UniqueName: \"kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8\") pod \"023e988c-ee82-44cf-aef6-235e9f4dbce0\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667546 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert\") pod \"b2f826d8-aa31-498d-8956-d2878fceba76\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667568 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert\") pod \"023e988c-ee82-44cf-aef6-235e9f4dbce0\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667600 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles\") pod \"023e988c-ee82-44cf-aef6-235e9f4dbce0\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667613 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca\") pod \"023e988c-ee82-44cf-aef6-235e9f4dbce0\" (UID: \"023e988c-ee82-44cf-aef6-235e9f4dbce0\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.667641 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca\") pod \"b2f826d8-aa31-498d-8956-d2878fceba76\" (UID: \"b2f826d8-aa31-498d-8956-d2878fceba76\") " Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.668374 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca" (OuterVolumeSpecName: "client-ca") pod "b2f826d8-aa31-498d-8956-d2878fceba76" (UID: "b2f826d8-aa31-498d-8956-d2878fceba76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.668483 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config" (OuterVolumeSpecName: "config") pod "b2f826d8-aa31-498d-8956-d2878fceba76" (UID: "b2f826d8-aa31-498d-8956-d2878fceba76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.668960 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "023e988c-ee82-44cf-aef6-235e9f4dbce0" (UID: "023e988c-ee82-44cf-aef6-235e9f4dbce0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.668988 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca" (OuterVolumeSpecName: "client-ca") pod "023e988c-ee82-44cf-aef6-235e9f4dbce0" (UID: "023e988c-ee82-44cf-aef6-235e9f4dbce0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.669336 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config" (OuterVolumeSpecName: "config") pod "023e988c-ee82-44cf-aef6-235e9f4dbce0" (UID: "023e988c-ee82-44cf-aef6-235e9f4dbce0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.673210 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "023e988c-ee82-44cf-aef6-235e9f4dbce0" (UID: "023e988c-ee82-44cf-aef6-235e9f4dbce0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.673269 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8" (OuterVolumeSpecName: "kube-api-access-rkgj8") pod "023e988c-ee82-44cf-aef6-235e9f4dbce0" (UID: "023e988c-ee82-44cf-aef6-235e9f4dbce0"). InnerVolumeSpecName "kube-api-access-rkgj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.673399 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l" (OuterVolumeSpecName: "kube-api-access-xr99l") pod "b2f826d8-aa31-498d-8956-d2878fceba76" (UID: "b2f826d8-aa31-498d-8956-d2878fceba76"). InnerVolumeSpecName "kube-api-access-xr99l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.673612 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b2f826d8-aa31-498d-8956-d2878fceba76" (UID: "b2f826d8-aa31-498d-8956-d2878fceba76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769544 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769608 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769619 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f826d8-aa31-498d-8956-d2878fceba76-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769630 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr99l\" (UniqueName: \"kubernetes.io/projected/b2f826d8-aa31-498d-8956-d2878fceba76-kube-api-access-xr99l\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769644 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgj8\" (UniqueName: \"kubernetes.io/projected/023e988c-ee82-44cf-aef6-235e9f4dbce0-kube-api-access-rkgj8\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769656 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b2f826d8-aa31-498d-8956-d2878fceba76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769664 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023e988c-ee82-44cf-aef6-235e9f4dbce0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769673 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:47 crc kubenswrapper[4893]: I0314 07:01:47.769682 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/023e988c-ee82-44cf-aef6-235e9f4dbce0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.163303 4893 generic.go:334] "Generic (PLEG): container finished" podID="b2f826d8-aa31-498d-8956-d2878fceba76" containerID="419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568" exitCode=0 Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.163342 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" event={"ID":"b2f826d8-aa31-498d-8956-d2878fceba76","Type":"ContainerDied","Data":"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568"} Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.163373 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.163395 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn" event={"ID":"b2f826d8-aa31-498d-8956-d2878fceba76","Type":"ContainerDied","Data":"df52ebcb46356dbd7fe2e27e4e425c160203e1c8e84054c5d7245796ee96ad76"} Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.163413 4893 scope.go:117] "RemoveContainer" containerID="419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.165823 4893 generic.go:334] "Generic (PLEG): container finished" podID="023e988c-ee82-44cf-aef6-235e9f4dbce0" containerID="3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3" exitCode=0 Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.166469 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.170733 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" event={"ID":"023e988c-ee82-44cf-aef6-235e9f4dbce0","Type":"ContainerDied","Data":"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3"} Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.170782 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b49d764f-vrgs2" event={"ID":"023e988c-ee82-44cf-aef6-235e9f4dbce0","Type":"ContainerDied","Data":"f58bfba1615044406339b45604c2ddc84c23c5ed6535da8086919c497769a3f3"} Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.185587 4893 scope.go:117] "RemoveContainer" containerID="419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568" Mar 14 07:01:48 crc kubenswrapper[4893]: E0314 07:01:48.186054 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568\": container with ID starting with 419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568 not found: ID does not exist" containerID="419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.186097 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568"} err="failed to get container status \"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568\": rpc error: code = NotFound desc = could not find container \"419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568\": container with ID starting with 419e6eb5d09defdc7ecd474314fe227f1b6a009ae07361cb9820a7f077bfc568 not found: ID does 
not exist" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.186120 4893 scope.go:117] "RemoveContainer" containerID="3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.203361 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.211409 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d5584569d-z5tfn"] Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.220770 4893 scope.go:117] "RemoveContainer" containerID="3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.220808 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.223074 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84b49d764f-vrgs2"] Mar 14 07:01:48 crc kubenswrapper[4893]: E0314 07:01:48.223934 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3\": container with ID starting with 3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3 not found: ID does not exist" containerID="3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3" Mar 14 07:01:48 crc kubenswrapper[4893]: I0314 07:01:48.223972 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3"} err="failed to get container status \"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3\": rpc error: code = NotFound desc = could not 
find container \"3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3\": container with ID starting with 3b871d5b6b67f6c189a8d2e028d72742cf6e11954caa09caa990d54848d8c0a3 not found: ID does not exist" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.067368 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:01:49 crc kubenswrapper[4893]: E0314 07:01:49.067934 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" containerName="pruner" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.067949 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" containerName="pruner" Mar 14 07:01:49 crc kubenswrapper[4893]: E0314 07:01:49.067961 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f826d8-aa31-498d-8956-d2878fceba76" containerName="route-controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.067968 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f826d8-aa31-498d-8956-d2878fceba76" containerName="route-controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: E0314 07:01:49.067980 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023e988c-ee82-44cf-aef6-235e9f4dbce0" containerName="controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.067987 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="023e988c-ee82-44cf-aef6-235e9f4dbce0" containerName="controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.068095 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="023e988c-ee82-44cf-aef6-235e9f4dbce0" containerName="controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.068110 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f826d8-aa31-498d-8956-d2878fceba76" 
containerName="route-controller-manager" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.068124 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e9e7e9-82a8-4fe5-87fe-68eac1a350f1" containerName="pruner" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.068574 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.070000 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.070460 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.070681 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.070879 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.071487 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.072346 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.073110 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.073369 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 
14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.077218 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.077857 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.078112 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.078307 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.078506 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.078670 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.080851 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.081547 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.083781 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189341 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert\") pod \"controller-manager-58ffcf8686-lkg7m\" 
(UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189400 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189465 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189496 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67vtv\" (UniqueName: \"kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189542 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189610 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189642 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189672 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.189775 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcq6\" (UniqueName: \"kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.290821 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert\") pod 
\"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.290888 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.290908 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.290930 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.290952 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcq6\" (UniqueName: \"kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.292110 4893 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.292292 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.292868 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.292994 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.293031 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 
07:01:49.293089 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vtv\" (UniqueName: \"kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.293192 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.294432 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.295241 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.296263 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " 
pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.303879 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.306299 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcq6\" (UniqueName: \"kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6\") pod \"route-controller-manager-db54787bb-pvcjl\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.306553 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vtv\" (UniqueName: \"kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv\") pod \"controller-manager-58ffcf8686-lkg7m\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.383439 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023e988c-ee82-44cf-aef6-235e9f4dbce0" path="/var/lib/kubelet/pods/023e988c-ee82-44cf-aef6-235e9f4dbce0/volumes" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.383969 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f826d8-aa31-498d-8956-d2878fceba76" path="/var/lib/kubelet/pods/b2f826d8-aa31-498d-8956-d2878fceba76/volumes" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.397769 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.408963 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.672533 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:01:49 crc kubenswrapper[4893]: W0314 07:01:49.677911 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686355b3_8ecb_49d4_9230_c9d64d5ca268.slice/crio-ee8cd669b5997e78d407e45c4402d8921a1f59c0a237ff1e24557b9fc279c6a0 WatchSource:0}: Error finding container ee8cd669b5997e78d407e45c4402d8921a1f59c0a237ff1e24557b9fc279c6a0: Status 404 returned error can't find the container with id ee8cd669b5997e78d407e45c4402d8921a1f59c0a237ff1e24557b9fc279c6a0 Mar 14 07:01:49 crc kubenswrapper[4893]: I0314 07:01:49.916382 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 07:01:49 crc kubenswrapper[4893]: W0314 07:01:49.925209 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f2f9c0_c74e_4918_bae7_0c3656513ba4.slice/crio-d872ceab733f38c564f567654da974ec64b264843e5aac24dcf42ff0227515c8 WatchSource:0}: Error finding container d872ceab733f38c564f567654da974ec64b264843e5aac24dcf42ff0227515c8: Status 404 returned error can't find the container with id d872ceab733f38c564f567654da974ec64b264843e5aac24dcf42ff0227515c8 Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.179545 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" 
event={"ID":"686355b3-8ecb-49d4-9230-c9d64d5ca268","Type":"ContainerStarted","Data":"09f60905d4310bf58c24979769a4fdd89ad93f1df5da3de6071faae8c9b0257f"} Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.179595 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" event={"ID":"686355b3-8ecb-49d4-9230-c9d64d5ca268","Type":"ContainerStarted","Data":"ee8cd669b5997e78d407e45c4402d8921a1f59c0a237ff1e24557b9fc279c6a0"} Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.179752 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.181593 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" event={"ID":"52f2f9c0-c74e-4918-bae7-0c3656513ba4","Type":"ContainerStarted","Data":"3bc73172b0fc42fc9653ec721a109e28550474722584a71b341ccc25ee03382a"} Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.181647 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" event={"ID":"52f2f9c0-c74e-4918-bae7-0c3656513ba4","Type":"ContainerStarted","Data":"d872ceab733f38c564f567654da974ec64b264843e5aac24dcf42ff0227515c8"} Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.181777 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.186713 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.200204 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" podStartSLOduration=5.200183437 podStartE2EDuration="5.200183437s" podCreationTimestamp="2026-03-14 07:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:50.195927794 +0000 UTC m=+189.458104606" watchObservedRunningTime="2026-03-14 07:01:50.200183437 +0000 UTC m=+189.462360229" Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.226649 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" podStartSLOduration=5.22662838 podStartE2EDuration="5.22662838s" podCreationTimestamp="2026-03-14 07:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:50.22540903 +0000 UTC m=+189.487585832" watchObservedRunningTime="2026-03-14 07:01:50.22662838 +0000 UTC m=+189.488805182" Mar 14 07:01:50 crc kubenswrapper[4893]: I0314 07:01:50.278340 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:01:53 crc kubenswrapper[4893]: I0314 07:01:53.200342 4893 generic.go:334] "Generic (PLEG): container finished" podID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerID="3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f" exitCode=0 Mar 14 07:01:53 crc kubenswrapper[4893]: I0314 07:01:53.200588 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerDied","Data":"3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f"} Mar 14 07:01:54 crc kubenswrapper[4893]: I0314 07:01:54.207480 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerStarted","Data":"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c"} Mar 14 07:01:54 crc kubenswrapper[4893]: I0314 07:01:54.210569 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerStarted","Data":"e900a302b46ca9e81acc00477f015190845447188d0df4f90cdbcd767bb583cd"} Mar 14 07:01:54 crc kubenswrapper[4893]: I0314 07:01:54.230870 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-727v2" podStartSLOduration=2.98348056 podStartE2EDuration="49.230849249s" podCreationTimestamp="2026-03-14 07:01:05 +0000 UTC" firstStartedPulling="2026-03-14 07:01:07.649760758 +0000 UTC m=+146.911937550" lastFinishedPulling="2026-03-14 07:01:53.897129447 +0000 UTC m=+193.159306239" observedRunningTime="2026-03-14 07:01:54.228709038 +0000 UTC m=+193.490885850" watchObservedRunningTime="2026-03-14 07:01:54.230849249 +0000 UTC m=+193.493026061" Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.216400 4893 generic.go:334] "Generic (PLEG): container finished" podID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerID="e900a302b46ca9e81acc00477f015190845447188d0df4f90cdbcd767bb583cd" exitCode=0 Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.216447 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerDied","Data":"e900a302b46ca9e81acc00477f015190845447188d0df4f90cdbcd767bb583cd"} Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.734168 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.773982 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.874871 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:55 crc kubenswrapper[4893]: I0314 07:01:55.874912 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:01:56 crc kubenswrapper[4893]: I0314 07:01:56.223124 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerStarted","Data":"fc0cd124674798c21623e1b056821e8346e46a680c244ebe1bcb21cb03fade33"} Mar 14 07:01:56 crc kubenswrapper[4893]: I0314 07:01:56.244320 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vhk5b" podStartSLOduration=3.200136057 podStartE2EDuration="51.244299764s" podCreationTimestamp="2026-03-14 07:01:05 +0000 UTC" firstStartedPulling="2026-03-14 07:01:07.62308625 +0000 UTC m=+146.885263042" lastFinishedPulling="2026-03-14 07:01:55.667249967 +0000 UTC m=+194.929426749" observedRunningTime="2026-03-14 07:01:56.239388615 +0000 UTC m=+195.501565427" watchObservedRunningTime="2026-03-14 07:01:56.244299764 +0000 UTC m=+195.506476556" Mar 14 07:01:56 crc kubenswrapper[4893]: I0314 07:01:56.396168 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:56 crc kubenswrapper[4893]: I0314 07:01:56.434688 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:01:56 crc kubenswrapper[4893]: I0314 07:01:56.936654 4893 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-727v2" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="registry-server" probeResult="failure" output=< Mar 14 07:01:56 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:01:56 crc kubenswrapper[4893]: > Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.235205 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerStarted","Data":"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d"} Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.239132 4893 generic.go:334] "Generic (PLEG): container finished" podID="212d2416-8201-4cae-a8b9-3121de2e8348" containerID="c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214" exitCode=0 Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.239175 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerDied","Data":"c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214"} Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.241688 4893 generic.go:334] "Generic (PLEG): container finished" podID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerID="70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e" exitCode=0 Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.241756 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerDied","Data":"70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e"} Mar 14 07:01:58 crc kubenswrapper[4893]: I0314 07:01:58.243687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" 
event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerStarted","Data":"d618272cb79a935dc611117a0c341159f37bf17353f666dfaabe1dea27e01f6a"} Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.250041 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerStarted","Data":"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a"} Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.252750 4893 generic.go:334] "Generic (PLEG): container finished" podID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerID="d618272cb79a935dc611117a0c341159f37bf17353f666dfaabe1dea27e01f6a" exitCode=0 Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.252806 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerDied","Data":"d618272cb79a935dc611117a0c341159f37bf17353f666dfaabe1dea27e01f6a"} Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.255110 4893 generic.go:334] "Generic (PLEG): container finished" podID="3a28de63-7c73-4b79-9242-7dda511afc68" containerID="fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d" exitCode=0 Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.255163 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerDied","Data":"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d"} Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.257127 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerStarted","Data":"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708"} Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 
07:01:59.295650 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-84hsf" podStartSLOduration=3.403225865 podStartE2EDuration="52.295630256s" podCreationTimestamp="2026-03-14 07:01:07 +0000 UTC" firstStartedPulling="2026-03-14 07:01:09.79142427 +0000 UTC m=+149.053601062" lastFinishedPulling="2026-03-14 07:01:58.683828661 +0000 UTC m=+197.946005453" observedRunningTime="2026-03-14 07:01:59.294475593 +0000 UTC m=+198.556652405" watchObservedRunningTime="2026-03-14 07:01:59.295630256 +0000 UTC m=+198.557807048" Mar 14 07:01:59 crc kubenswrapper[4893]: I0314 07:01:59.302148 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gn5xf" podStartSLOduration=2.279146838 podStartE2EDuration="52.302122877s" podCreationTimestamp="2026-03-14 07:01:07 +0000 UTC" firstStartedPulling="2026-03-14 07:01:08.673475703 +0000 UTC m=+147.935652495" lastFinishedPulling="2026-03-14 07:01:58.696451742 +0000 UTC m=+197.958628534" observedRunningTime="2026-03-14 07:01:59.275763571 +0000 UTC m=+198.537940363" watchObservedRunningTime="2026-03-14 07:01:59.302122877 +0000 UTC m=+198.564299679" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.136724 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557862-tmqkz"] Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.137718 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.139493 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.139960 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.140000 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.146253 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-tmqkz"] Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.236026 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkc5\" (UniqueName: \"kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5\") pod \"auto-csr-approver-29557862-tmqkz\" (UID: \"19c5a27c-6f7c-46d9-bc16-27796b0bd030\") " pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.267875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerStarted","Data":"e4ddcf34d82f5565176166fb88eb31ccc6cc6e954e24a621657a20759900a54b"} Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.272508 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerStarted","Data":"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573"} Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.286306 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-6hlsj" podStartSLOduration=2.222271434 podStartE2EDuration="51.286288772s" podCreationTimestamp="2026-03-14 07:01:09 +0000 UTC" firstStartedPulling="2026-03-14 07:01:10.808215677 +0000 UTC m=+150.070392469" lastFinishedPulling="2026-03-14 07:01:59.872233015 +0000 UTC m=+199.134409807" observedRunningTime="2026-03-14 07:02:00.283965893 +0000 UTC m=+199.546142695" watchObservedRunningTime="2026-03-14 07:02:00.286288772 +0000 UTC m=+199.548465564" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.306889 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xsf27" podStartSLOduration=3.173775583 podStartE2EDuration="52.306867778s" podCreationTimestamp="2026-03-14 07:01:08 +0000 UTC" firstStartedPulling="2026-03-14 07:01:10.803227357 +0000 UTC m=+150.065404149" lastFinishedPulling="2026-03-14 07:01:59.936319552 +0000 UTC m=+199.198496344" observedRunningTime="2026-03-14 07:02:00.304849658 +0000 UTC m=+199.567026450" watchObservedRunningTime="2026-03-14 07:02:00.306867778 +0000 UTC m=+199.569044570" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.337160 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkc5\" (UniqueName: \"kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5\") pod \"auto-csr-approver-29557862-tmqkz\" (UID: \"19c5a27c-6f7c-46d9-bc16-27796b0bd030\") " pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.356889 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkc5\" (UniqueName: \"kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5\") pod \"auto-csr-approver-29557862-tmqkz\" (UID: \"19c5a27c-6f7c-46d9-bc16-27796b0bd030\") " pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 
07:02:00.431785 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.432001 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dkbml" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="registry-server" containerID="cri-o://1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774" gracePeriod=2 Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.452348 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.889354 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-tmqkz"] Mar 14 07:02:00 crc kubenswrapper[4893]: W0314 07:02:00.896594 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19c5a27c_6f7c_46d9_bc16_27796b0bd030.slice/crio-c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938 WatchSource:0}: Error finding container c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938: Status 404 returned error can't find the container with id c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938 Mar 14 07:02:00 crc kubenswrapper[4893]: I0314 07:02:00.998433 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.046581 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content\") pod \"bd723b75-2439-4310-af24-f180d494ec68\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.046643 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dzwh\" (UniqueName: \"kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh\") pod \"bd723b75-2439-4310-af24-f180d494ec68\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.046681 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities\") pod \"bd723b75-2439-4310-af24-f180d494ec68\" (UID: \"bd723b75-2439-4310-af24-f180d494ec68\") " Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.047721 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities" (OuterVolumeSpecName: "utilities") pod "bd723b75-2439-4310-af24-f180d494ec68" (UID: "bd723b75-2439-4310-af24-f180d494ec68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.055733 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh" (OuterVolumeSpecName: "kube-api-access-4dzwh") pod "bd723b75-2439-4310-af24-f180d494ec68" (UID: "bd723b75-2439-4310-af24-f180d494ec68"). InnerVolumeSpecName "kube-api-access-4dzwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.102290 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd723b75-2439-4310-af24-f180d494ec68" (UID: "bd723b75-2439-4310-af24-f180d494ec68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.147673 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.147905 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dzwh\" (UniqueName: \"kubernetes.io/projected/bd723b75-2439-4310-af24-f180d494ec68-kube-api-access-4dzwh\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.147969 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd723b75-2439-4310-af24-f180d494ec68-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.289838 4893 generic.go:334] "Generic (PLEG): container finished" podID="bd723b75-2439-4310-af24-f180d494ec68" containerID="1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774" exitCode=0 Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.289926 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerDied","Data":"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774"} Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.289964 4893 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dkbml" event={"ID":"bd723b75-2439-4310-af24-f180d494ec68","Type":"ContainerDied","Data":"9232b33f6d77f897b909f7e366384a54d3b02c19c9adc33122f68997eda7ad3a"} Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.289985 4893 scope.go:117] "RemoveContainer" containerID="1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.290024 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkbml" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.290834 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" event={"ID":"19c5a27c-6f7c-46d9-bc16-27796b0bd030","Type":"ContainerStarted","Data":"c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938"} Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.317303 4893 scope.go:117] "RemoveContainer" containerID="e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.329330 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.337924 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dkbml"] Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.342891 4893 scope.go:117] "RemoveContainer" containerID="34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.372321 4893 scope.go:117] "RemoveContainer" containerID="1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774" Mar 14 07:02:01 crc kubenswrapper[4893]: E0314 07:02:01.372895 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774\": container with ID starting with 1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774 not found: ID does not exist" containerID="1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.372936 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774"} err="failed to get container status \"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774\": rpc error: code = NotFound desc = could not find container \"1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774\": container with ID starting with 1f7d49c48aa5c7d30acee1a33d7196b30b4dc7844d32fa49ed1a1426d0fd9774 not found: ID does not exist" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.372962 4893 scope.go:117] "RemoveContainer" containerID="e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06" Mar 14 07:02:01 crc kubenswrapper[4893]: E0314 07:02:01.373386 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06\": container with ID starting with e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06 not found: ID does not exist" containerID="e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.373406 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06"} err="failed to get container status \"e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06\": rpc error: code = NotFound desc = could not find container \"e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06\": container with ID 
starting with e7a557e064b03c6e3f5ac7058570ea6e240cae8d9e7bd7b41c4d3aa4504aaf06 not found: ID does not exist" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.373418 4893 scope.go:117] "RemoveContainer" containerID="34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278" Mar 14 07:02:01 crc kubenswrapper[4893]: E0314 07:02:01.373751 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278\": container with ID starting with 34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278 not found: ID does not exist" containerID="34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.373771 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278"} err="failed to get container status \"34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278\": rpc error: code = NotFound desc = could not find container \"34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278\": container with ID starting with 34d7649e839505bd8fc824c6ec09c8a6328c336a0b41703f8c7dfe891eaf0278 not found: ID does not exist" Mar 14 07:02:01 crc kubenswrapper[4893]: I0314 07:02:01.384615 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd723b75-2439-4310-af24-f180d494ec68" path="/var/lib/kubelet/pods/bd723b75-2439-4310-af24-f180d494ec68/volumes" Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.817016 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.817750 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" 
podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerName="controller-manager" containerID="cri-o://3bc73172b0fc42fc9653ec721a109e28550474722584a71b341ccc25ee03382a" gracePeriod=30 Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.827667 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.827860 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerName="route-controller-manager" containerID="cri-o://09f60905d4310bf58c24979769a4fdd89ad93f1df5da3de6071faae8c9b0257f" gracePeriod=30 Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.912604 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:02:05 crc kubenswrapper[4893]: I0314 07:02:05.963078 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.091479 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.091653 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.124324 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.333574 4893 generic.go:334] "Generic (PLEG): container finished" podID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerID="09f60905d4310bf58c24979769a4fdd89ad93f1df5da3de6071faae8c9b0257f" 
exitCode=0 Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.333670 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" event={"ID":"686355b3-8ecb-49d4-9230-c9d64d5ca268","Type":"ContainerDied","Data":"09f60905d4310bf58c24979769a4fdd89ad93f1df5da3de6071faae8c9b0257f"} Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.335043 4893 generic.go:334] "Generic (PLEG): container finished" podID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerID="3bc73172b0fc42fc9653ec721a109e28550474722584a71b341ccc25ee03382a" exitCode=0 Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.335853 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" event={"ID":"52f2f9c0-c74e-4918-bae7-0c3656513ba4","Type":"ContainerDied","Data":"3bc73172b0fc42fc9653ec721a109e28550474722584a71b341ccc25ee03382a"} Mar 14 07:02:06 crc kubenswrapper[4893]: I0314 07:02:06.380326 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:07 crc kubenswrapper[4893]: I0314 07:02:07.683476 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:02:07 crc kubenswrapper[4893]: I0314 07:02:07.683718 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:02:07 crc kubenswrapper[4893]: I0314 07:02:07.731377 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.123798 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.123895 4893 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.166486 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.396253 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.397008 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:02:08 crc kubenswrapper[4893]: I0314 07:02:08.431697 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.229691 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.229743 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.274667 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.356835 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vhk5b" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="registry-server" containerID="cri-o://fc0cd124674798c21623e1b056821e8346e46a680c244ebe1bcb21cb03fade33" gracePeriod=2 Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.398691 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:02:09 crc 
kubenswrapper[4893]: I0314 07:02:09.795817 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.795866 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.877724 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.964198 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.969980 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.995659 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:02:09 crc kubenswrapper[4893]: E0314 07:02:09.995981 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="extract-utilities" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.995999 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="extract-utilities" Mar 14 07:02:09 crc kubenswrapper[4893]: E0314 07:02:09.996013 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="extract-content" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996022 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="extract-content" Mar 14 07:02:09 
crc kubenswrapper[4893]: E0314 07:02:09.996035 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerName="controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996043 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerName="controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: E0314 07:02:09.996060 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="registry-server" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996068 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="registry-server" Mar 14 07:02:09 crc kubenswrapper[4893]: E0314 07:02:09.996084 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerName="route-controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996092 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerName="route-controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996218 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd723b75-2439-4310-af24-f180d494ec68" containerName="registry-server" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996233 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerName="controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996254 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerName="route-controller-manager" Mar 14 07:02:09 crc kubenswrapper[4893]: I0314 07:02:09.996754 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.003150 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070428 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca\") pod \"686355b3-8ecb-49d4-9230-c9d64d5ca268\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070503 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67vtv\" (UniqueName: \"kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv\") pod \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070590 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert\") pod \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070631 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config\") pod \"686355b3-8ecb-49d4-9230-c9d64d5ca268\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070666 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca\") pod 
\"52f2f9c0-c74e-4918-bae7-0c3656513ba4\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070725 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles\") pod \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.070747 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config\") pod \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\" (UID: \"52f2f9c0-c74e-4918-bae7-0c3656513ba4\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.071460 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "52f2f9c0-c74e-4918-bae7-0c3656513ba4" (UID: "52f2f9c0-c74e-4918-bae7-0c3656513ba4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.071504 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca" (OuterVolumeSpecName: "client-ca") pod "52f2f9c0-c74e-4918-bae7-0c3656513ba4" (UID: "52f2f9c0-c74e-4918-bae7-0c3656513ba4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.071585 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert\") pod \"686355b3-8ecb-49d4-9230-c9d64d5ca268\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.071774 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcq6\" (UniqueName: \"kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6\") pod \"686355b3-8ecb-49d4-9230-c9d64d5ca268\" (UID: \"686355b3-8ecb-49d4-9230-c9d64d5ca268\") " Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.071971 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config" (OuterVolumeSpecName: "config") pod "686355b3-8ecb-49d4-9230-c9d64d5ca268" (UID: "686355b3-8ecb-49d4-9230-c9d64d5ca268"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.072340 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.072374 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.072393 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.072853 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config" (OuterVolumeSpecName: "config") pod "52f2f9c0-c74e-4918-bae7-0c3656513ba4" (UID: "52f2f9c0-c74e-4918-bae7-0c3656513ba4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.076590 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6" (OuterVolumeSpecName: "kube-api-access-fgcq6") pod "686355b3-8ecb-49d4-9230-c9d64d5ca268" (UID: "686355b3-8ecb-49d4-9230-c9d64d5ca268"). InnerVolumeSpecName "kube-api-access-fgcq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.076634 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "686355b3-8ecb-49d4-9230-c9d64d5ca268" (UID: "686355b3-8ecb-49d4-9230-c9d64d5ca268"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.076657 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52f2f9c0-c74e-4918-bae7-0c3656513ba4" (UID: "52f2f9c0-c74e-4918-bae7-0c3656513ba4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.076766 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca" (OuterVolumeSpecName: "client-ca") pod "686355b3-8ecb-49d4-9230-c9d64d5ca268" (UID: "686355b3-8ecb-49d4-9230-c9d64d5ca268"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.082946 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv" (OuterVolumeSpecName: "kube-api-access-67vtv") pod "52f2f9c0-c74e-4918-bae7-0c3656513ba4" (UID: "52f2f9c0-c74e-4918-bae7-0c3656513ba4"). InnerVolumeSpecName "kube-api-access-67vtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174130 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174556 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrsr\" (UniqueName: \"kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174629 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174668 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174755 4893 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/686355b3-8ecb-49d4-9230-c9d64d5ca268-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174778 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcq6\" (UniqueName: \"kubernetes.io/projected/686355b3-8ecb-49d4-9230-c9d64d5ca268-kube-api-access-fgcq6\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174797 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/686355b3-8ecb-49d4-9230-c9d64d5ca268-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174815 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67vtv\" (UniqueName: \"kubernetes.io/projected/52f2f9c0-c74e-4918-bae7-0c3656513ba4-kube-api-access-67vtv\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174832 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f2f9c0-c74e-4918-bae7-0c3656513ba4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.174849 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f2f9c0-c74e-4918-bae7-0c3656513ba4-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.276202 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrsr\" (UniqueName: \"kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.276277 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.276311 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.276356 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.277451 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.280649 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " 
pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.281669 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.293979 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvrsr\" (UniqueName: \"kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr\") pod \"route-controller-manager-59689957cc-nqtk2\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.323614 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.365395 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" event={"ID":"686355b3-8ecb-49d4-9230-c9d64d5ca268","Type":"ContainerDied","Data":"ee8cd669b5997e78d407e45c4402d8921a1f59c0a237ff1e24557b9fc279c6a0"} Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.365425 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.365711 4893 scope.go:117] "RemoveContainer" containerID="09f60905d4310bf58c24979769a4fdd89ad93f1df5da3de6071faae8c9b0257f" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.368002 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" event={"ID":"52f2f9c0-c74e-4918-bae7-0c3656513ba4","Type":"ContainerDied","Data":"d872ceab733f38c564f567654da974ec64b264843e5aac24dcf42ff0227515c8"} Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.368047 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.372042 4893 generic.go:334] "Generic (PLEG): container finished" podID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerID="fc0cd124674798c21623e1b056821e8346e46a680c244ebe1bcb21cb03fade33" exitCode=0 Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.372116 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerDied","Data":"fc0cd124674798c21623e1b056821e8346e46a680c244ebe1bcb21cb03fade33"} Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.400040 4893 patch_prober.go:28] interesting pod/route-controller-manager-db54787bb-pvcjl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.400133 4893 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.401178 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.410180 4893 patch_prober.go:28] interesting pod/controller-manager-58ffcf8686-lkg7m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.410241 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-58ffcf8686-lkg7m" podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.420702 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-db54787bb-pvcjl"] Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.427278 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.429228 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58ffcf8686-lkg7m"] Mar 14 
07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.437283 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.826990 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.827308 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-84hsf" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="registry-server" containerID="cri-o://b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708" gracePeriod=2 Mar 14 07:02:10 crc kubenswrapper[4893]: I0314 07:02:10.975813 4893 scope.go:117] "RemoveContainer" containerID="3bc73172b0fc42fc9653ec721a109e28550474722584a71b341ccc25ee03382a" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.022124 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.190053 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks8zw\" (UniqueName: \"kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw\") pod \"28f15d4e-4de1-481e-bd52-d4dcada252a4\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.190550 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content\") pod \"28f15d4e-4de1-481e-bd52-d4dcada252a4\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.190590 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities\") pod \"28f15d4e-4de1-481e-bd52-d4dcada252a4\" (UID: \"28f15d4e-4de1-481e-bd52-d4dcada252a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.191501 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities" (OuterVolumeSpecName: "utilities") pod "28f15d4e-4de1-481e-bd52-d4dcada252a4" (UID: "28f15d4e-4de1-481e-bd52-d4dcada252a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.212038 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw" (OuterVolumeSpecName: "kube-api-access-ks8zw") pod "28f15d4e-4de1-481e-bd52-d4dcada252a4" (UID: "28f15d4e-4de1-481e-bd52-d4dcada252a4"). InnerVolumeSpecName "kube-api-access-ks8zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.250564 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28f15d4e-4de1-481e-bd52-d4dcada252a4" (UID: "28f15d4e-4de1-481e-bd52-d4dcada252a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.256103 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.292209 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks8zw\" (UniqueName: \"kubernetes.io/projected/28f15d4e-4de1-481e-bd52-d4dcada252a4-kube-api-access-ks8zw\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.292249 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.292257 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f15d4e-4de1-481e-bd52-d4dcada252a4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.380606 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhk5b" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.383766 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f2f9c0-c74e-4918-bae7-0c3656513ba4" path="/var/lib/kubelet/pods/52f2f9c0-c74e-4918-bae7-0c3656513ba4/volumes" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.384573 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686355b3-8ecb-49d4-9230-c9d64d5ca268" path="/var/lib/kubelet/pods/686355b3-8ecb-49d4-9230-c9d64d5ca268/volumes" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.387922 4893 generic.go:334] "Generic (PLEG): container finished" podID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerID="b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708" exitCode=0 Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.388018 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-84hsf" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.390236 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhk5b" event={"ID":"28f15d4e-4de1-481e-bd52-d4dcada252a4","Type":"ContainerDied","Data":"acc8a9581e067969c51fc5d53c4ddb0fcd7ed049ae41db327b296a4eb9d42cdb"} Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.390466 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerDied","Data":"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708"} Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.390482 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-84hsf" event={"ID":"00db91c2-32b1-4b21-9795-e47042b4b9a4","Type":"ContainerDied","Data":"6975f86bbf0ae73d6c39bd214508fc8255f16230902b8dc4881c906e227fe6fd"} Mar 
14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.390499 4893 scope.go:117] "RemoveContainer" containerID="fc0cd124674798c21623e1b056821e8346e46a680c244ebe1bcb21cb03fade33" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.392630 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content\") pod \"00db91c2-32b1-4b21-9795-e47042b4b9a4\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.392742 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities\") pod \"00db91c2-32b1-4b21-9795-e47042b4b9a4\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.392838 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87\") pod \"00db91c2-32b1-4b21-9795-e47042b4b9a4\" (UID: \"00db91c2-32b1-4b21-9795-e47042b4b9a4\") " Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.394323 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities" (OuterVolumeSpecName: "utilities") pod "00db91c2-32b1-4b21-9795-e47042b4b9a4" (UID: "00db91c2-32b1-4b21-9795-e47042b4b9a4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.404173 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87" (OuterVolumeSpecName: "kube-api-access-4dq87") pod "00db91c2-32b1-4b21-9795-e47042b4b9a4" (UID: "00db91c2-32b1-4b21-9795-e47042b4b9a4"). InnerVolumeSpecName "kube-api-access-4dq87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.417850 4893 scope.go:117] "RemoveContainer" containerID="e900a302b46ca9e81acc00477f015190845447188d0df4f90cdbcd767bb583cd" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.420241 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.428404 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vhk5b"] Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.429967 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00db91c2-32b1-4b21-9795-e47042b4b9a4" (UID: "00db91c2-32b1-4b21-9795-e47042b4b9a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.437440 4893 scope.go:117] "RemoveContainer" containerID="ddc5f1090ac1a8c06dbbcbc5d6a750840f369a0d3b5747a26457caadf0fdf413" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.459434 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.462327 4893 scope.go:117] "RemoveContainer" containerID="b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708" Mar 14 07:02:11 crc kubenswrapper[4893]: W0314 07:02:11.465835 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc6aea3_c426_4e6b_9f31_cdace87b288e.slice/crio-e5f4c52cf914d1d29b7dfef16bafdbd0da7f205d2228ccee125af67c15d7ae22 WatchSource:0}: Error finding container e5f4c52cf914d1d29b7dfef16bafdbd0da7f205d2228ccee125af67c15d7ae22: Status 404 returned error can't find the container with id e5f4c52cf914d1d29b7dfef16bafdbd0da7f205d2228ccee125af67c15d7ae22 Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.474588 4893 scope.go:117] "RemoveContainer" containerID="70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.494394 4893 scope.go:117] "RemoveContainer" containerID="4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.494470 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.494499 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dq87\" (UniqueName: 
\"kubernetes.io/projected/00db91c2-32b1-4b21-9795-e47042b4b9a4-kube-api-access-4dq87\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.494510 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00db91c2-32b1-4b21-9795-e47042b4b9a4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.513012 4893 scope.go:117] "RemoveContainer" containerID="b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708" Mar 14 07:02:11 crc kubenswrapper[4893]: E0314 07:02:11.514221 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708\": container with ID starting with b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708 not found: ID does not exist" containerID="b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.514368 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708"} err="failed to get container status \"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708\": rpc error: code = NotFound desc = could not find container \"b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708\": container with ID starting with b1e95e25be1dc1fd7ee4ff0d6d851a112c9e18efd6bc3d83face4f1aefda1708 not found: ID does not exist" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.514417 4893 scope.go:117] "RemoveContainer" containerID="70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e" Mar 14 07:02:11 crc kubenswrapper[4893]: E0314 07:02:11.515150 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e\": container with ID starting with 70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e not found: ID does not exist" containerID="70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.515192 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e"} err="failed to get container status \"70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e\": rpc error: code = NotFound desc = could not find container \"70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e\": container with ID starting with 70c4fa752470b944f0f1731115975db4874198018da4717d8692f5c49a456a6e not found: ID does not exist" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.515221 4893 scope.go:117] "RemoveContainer" containerID="4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5" Mar 14 07:02:11 crc kubenswrapper[4893]: E0314 07:02:11.515646 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5\": container with ID starting with 4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5 not found: ID does not exist" containerID="4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.515674 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5"} err="failed to get container status \"4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5\": rpc error: code = NotFound desc = could not find container \"4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5\": container with ID 
starting with 4b69fcab3f4b059d2e0c4faa086a3d2a184ca90d28b0a7ef54b75327656d28c5 not found: ID does not exist" Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.715561 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.721253 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-84hsf"] Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.890619 4893 csr.go:261] certificate signing request csr-7dplm is approved, waiting to be issued Mar 14 07:02:11 crc kubenswrapper[4893]: I0314 07:02:11.897660 4893 csr.go:257] certificate signing request csr-7dplm is issued Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083230 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083418 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="extract-content" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083429 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="extract-content" Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083439 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083445 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083452 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="extract-utilities" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083458 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="extract-utilities" Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083474 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="extract-utilities" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083482 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="extract-utilities" Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083490 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="extract-content" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083497 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="extract-content" Mar 14 07:02:12 crc kubenswrapper[4893]: E0314 07:02:12.083506 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083511 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083612 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083623 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" containerName="registry-server" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.083958 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.086401 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.086640 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.087875 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.087961 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.088100 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.088239 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.094291 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.098061 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.203966 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " 
pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.204016 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.204052 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz2p\" (UniqueName: \"kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.204097 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.204204 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.305247 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.305347 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.305374 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.305398 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzz2p\" (UniqueName: \"kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.305433 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.306711 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.307271 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.307656 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.311620 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.323744 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz2p\" (UniqueName: \"kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p\") pod \"controller-manager-5655bc4dbd-26tjh\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 
07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.395875 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.405223 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" event={"ID":"cdc6aea3-c426-4e6b-9f31-cdace87b288e","Type":"ContainerStarted","Data":"8b36934016777d8402938fd1d798fc42a0fe65f4ac02866e089873c00e82370e"} Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.405335 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" event={"ID":"cdc6aea3-c426-4e6b-9f31-cdace87b288e","Type":"ContainerStarted","Data":"e5f4c52cf914d1d29b7dfef16bafdbd0da7f205d2228ccee125af67c15d7ae22"} Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.405625 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.409857 4893 generic.go:334] "Generic (PLEG): container finished" podID="19c5a27c-6f7c-46d9-bc16-27796b0bd030" containerID="85541424c8554b4ec125fcbc99002ef957e72ab249cce4e5d31dfa491362836e" exitCode=0 Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.409951 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" event={"ID":"19c5a27c-6f7c-46d9-bc16-27796b0bd030","Type":"ContainerDied","Data":"85541424c8554b4ec125fcbc99002ef957e72ab249cce4e5d31dfa491362836e"} Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.414448 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.427980 4893 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" podStartSLOduration=7.42796495 podStartE2EDuration="7.42796495s" podCreationTimestamp="2026-03-14 07:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:12.423918451 +0000 UTC m=+211.686095253" watchObservedRunningTime="2026-03-14 07:02:12.42796495 +0000 UTC m=+211.690141752" Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.631225 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:02:12 crc kubenswrapper[4893]: W0314 07:02:12.640005 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3b7325_46db_44d8_9c67_cde475def14a.slice/crio-c4aa915ee540138089f1183215ee8d7e6e9c2ab268c13ec53f83e15f22ee3d83 WatchSource:0}: Error finding container c4aa915ee540138089f1183215ee8d7e6e9c2ab268c13ec53f83e15f22ee3d83: Status 404 returned error can't find the container with id c4aa915ee540138089f1183215ee8d7e6e9c2ab268c13ec53f83e15f22ee3d83 Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.899148 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 19:24:44.58053304 +0000 UTC Mar 14 07:02:12 crc kubenswrapper[4893]: I0314 07:02:12.899200 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6156h22m31.681336002s for next certificate rotation Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.229461 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.229827 4893 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-6hlsj" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="registry-server" containerID="cri-o://e4ddcf34d82f5565176166fb88eb31ccc6cc6e954e24a621657a20759900a54b" gracePeriod=2 Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.389052 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00db91c2-32b1-4b21-9795-e47042b4b9a4" path="/var/lib/kubelet/pods/00db91c2-32b1-4b21-9795-e47042b4b9a4/volumes" Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.390326 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f15d4e-4de1-481e-bd52-d4dcada252a4" path="/var/lib/kubelet/pods/28f15d4e-4de1-481e-bd52-d4dcada252a4/volumes" Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.419877 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" event={"ID":"fd3b7325-46db-44d8-9c67-cde475def14a","Type":"ContainerStarted","Data":"c4aa915ee540138089f1183215ee8d7e6e9c2ab268c13ec53f83e15f22ee3d83"} Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.765865 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.899562 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 02:38:06.66739918 +0000 UTC Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.899611 4893 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6379h35m52.767792248s for next certificate rotation Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.926452 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkc5\" (UniqueName: \"kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5\") pod \"19c5a27c-6f7c-46d9-bc16-27796b0bd030\" (UID: \"19c5a27c-6f7c-46d9-bc16-27796b0bd030\") " Mar 14 07:02:13 crc kubenswrapper[4893]: I0314 07:02:13.935456 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5" (OuterVolumeSpecName: "kube-api-access-2kkc5") pod "19c5a27c-6f7c-46d9-bc16-27796b0bd030" (UID: "19c5a27c-6f7c-46d9-bc16-27796b0bd030"). InnerVolumeSpecName "kube-api-access-2kkc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.028692 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kkc5\" (UniqueName: \"kubernetes.io/projected/19c5a27c-6f7c-46d9-bc16-27796b0bd030-kube-api-access-2kkc5\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.426414 4893 generic.go:334] "Generic (PLEG): container finished" podID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerID="e4ddcf34d82f5565176166fb88eb31ccc6cc6e954e24a621657a20759900a54b" exitCode=0 Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.426492 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerDied","Data":"e4ddcf34d82f5565176166fb88eb31ccc6cc6e954e24a621657a20759900a54b"} Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.428054 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" event={"ID":"fd3b7325-46db-44d8-9c67-cde475def14a","Type":"ContainerStarted","Data":"f4de5eb4111e51cb92497df15674cddc8e3b7857637f956cf29506c6ebefeaeb"} Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.429553 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.429974 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-tmqkz" event={"ID":"19c5a27c-6f7c-46d9-bc16-27796b0bd030","Type":"ContainerDied","Data":"c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938"} Mar 14 07:02:14 crc kubenswrapper[4893]: I0314 07:02:14.430026 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2a0b88ec44a9ed322d24d3afd2484ace38c7c8208c3a26ca94f426b59109938" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.014801 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.142134 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content\") pod \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.142237 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities\") pod \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.142285 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-227db\" (UniqueName: \"kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db\") pod \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\" (UID: \"250e7aeb-ae56-47aa-99b5-47e01b338fd1\") " Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.144896 4893 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities" (OuterVolumeSpecName: "utilities") pod "250e7aeb-ae56-47aa-99b5-47e01b338fd1" (UID: "250e7aeb-ae56-47aa-99b5-47e01b338fd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.149627 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db" (OuterVolumeSpecName: "kube-api-access-227db") pod "250e7aeb-ae56-47aa-99b5-47e01b338fd1" (UID: "250e7aeb-ae56-47aa-99b5-47e01b338fd1"). InnerVolumeSpecName "kube-api-access-227db". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.245485 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.245554 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-227db\" (UniqueName: \"kubernetes.io/projected/250e7aeb-ae56-47aa-99b5-47e01b338fd1-kube-api-access-227db\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.438567 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hlsj" event={"ID":"250e7aeb-ae56-47aa-99b5-47e01b338fd1","Type":"ContainerDied","Data":"8479710ac7103e8375eadf05c3fa294bd42048ca83ede063787971a3d95b495d"} Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.439234 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.438649 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hlsj" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.439291 4893 scope.go:117] "RemoveContainer" containerID="e4ddcf34d82f5565176166fb88eb31ccc6cc6e954e24a621657a20759900a54b" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.445645 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.462378 4893 scope.go:117] "RemoveContainer" containerID="d618272cb79a935dc611117a0c341159f37bf17353f666dfaabe1dea27e01f6a" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.467982 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" podStartSLOduration=10.467960369 podStartE2EDuration="10.467960369s" podCreationTimestamp="2026-03-14 07:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:15.464451776 +0000 UTC m=+214.726628638" watchObservedRunningTime="2026-03-14 07:02:15.467960369 +0000 UTC m=+214.730137171" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.491488 4893 scope.go:117] "RemoveContainer" containerID="2ea75b4d656e4cb1f477668083b27517f228be9251653a28525ccafe25c86c0e" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.651566 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "250e7aeb-ae56-47aa-99b5-47e01b338fd1" (UID: "250e7aeb-ae56-47aa-99b5-47e01b338fd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.751304 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250e7aeb-ae56-47aa-99b5-47e01b338fd1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.774136 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:02:15 crc kubenswrapper[4893]: I0314 07:02:15.779951 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6hlsj"] Mar 14 07:02:17 crc kubenswrapper[4893]: I0314 07:02:17.236877 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sfrgk"] Mar 14 07:02:17 crc kubenswrapper[4893]: I0314 07:02:17.382818 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" path="/var/lib/kubelet/pods/250e7aeb-ae56-47aa-99b5-47e01b338fd1/volumes" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.493650 4893 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.494274 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="extract-utilities" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.494296 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="extract-utilities" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.494325 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="registry-server" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.494342 4893 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="registry-server" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.494361 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="extract-content" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.494377 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="extract-content" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.494399 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c5a27c-6f7c-46d9-bc16-27796b0bd030" containerName="oc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.494415 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c5a27c-6f7c-46d9-bc16-27796b0bd030" containerName="oc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.496636 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c5a27c-6f7c-46d9-bc16-27796b0bd030" containerName="oc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.496678 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e7aeb-ae56-47aa-99b5-47e01b338fd1" containerName="registry-server" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.497616 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.544627 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.557508 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.557587 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.557618 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.557687 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: 
I0314 07:02:24.557801 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.559434 4893 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.559930 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530" gracePeriod=15 Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.559969 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61" gracePeriod=15 Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.560046 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a" gracePeriod=15 Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.560057 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee" gracePeriod=15 Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.559979 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb" gracePeriod=15 Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.564901 4893 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565458 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565496 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565516 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565569 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565592 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565607 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 
07:02:24.565626 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565639 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565659 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565674 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565692 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565706 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565732 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565745 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.565769 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565783 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:02:24 crc 
kubenswrapper[4893]: E0314 07:02:24.565803 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565817 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.565994 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566015 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566029 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566045 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566063 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566080 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566096 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.566286 4893 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566302 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566463 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.566480 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.658772 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.658850 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.658889 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.658947 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659036 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659043 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659074 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659142 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659294 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659489 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659551 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659840 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.659990 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.761624 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.761678 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.761701 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.761811 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.761868 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.762121 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: I0314 07:02:24.830116 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:02:24 crc kubenswrapper[4893]: E0314 07:02:24.897144 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca327ddbefc43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 07:02:24.896465987 +0000 UTC m=+224.158642829,LastTimestamp:2026-03-14 07:02:24.896465987 +0000 UTC m=+224.158642829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.006030 4893 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.006082 4893 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.507827 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b8928467294c43a5436bf9538ac37fb5fe9005ea78e2fab0a866f08f5e69671e"} Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.516854 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.518733 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:02:25 crc kubenswrapper[4893]: I0314 07:02:25.519991 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a" exitCode=2 Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.528312 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.529967 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.530678 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61" exitCode=0 Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.530707 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee" exitCode=0 Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.530716 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb" exitCode=0 Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.530776 4893 scope.go:117] "RemoveContainer" containerID="431bfcfc9563ced2c11a429c8d1775119884bdebac6d38a8341dc0958b1f5aa6" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.532161 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a"} Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.533630 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.535772 4893 generic.go:334] "Generic (PLEG): container finished" podID="c077ca86-7535-42ff-b1c4-2acec5092071" containerID="a36a94358c8a537c018de2194283954daf43f5f034f747459589670389e03d57" exitCode=0 Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.535858 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"c077ca86-7535-42ff-b1c4-2acec5092071","Type":"ContainerDied","Data":"a36a94358c8a537c018de2194283954daf43f5f034f747459589670389e03d57"} Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.536948 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.537365 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.931945 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.933183 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.934121 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.934635 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[4893]: I0314 07:02:26.935183 4893 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015502 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015651 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015670 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015726 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015765 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.015929 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.016185 4893 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.016203 4893 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.016212 4893 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.180438 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:27Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:27Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:27Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:27Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.180898 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.181298 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.181920 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.182592 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.182634 4893 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.396233 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.544494 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:02:27 crc 
kubenswrapper[4893]: I0314 07:02:27.545280 4893 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530" exitCode=0 Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.545626 4893 scope.go:117] "RemoveContainer" containerID="086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.545752 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.546377 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.546718 4893 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.547412 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.550945 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.551494 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.551859 4893 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.576340 4893 scope.go:117] "RemoveContainer" containerID="0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.599795 4893 scope.go:117] "RemoveContainer" containerID="67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.617004 4893 scope.go:117] "RemoveContainer" containerID="9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.631752 4893 scope.go:117] "RemoveContainer" containerID="1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.654000 4893 scope.go:117] "RemoveContainer" containerID="c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.687474 4893 scope.go:117] 
"RemoveContainer" containerID="086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.689176 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61\": container with ID starting with 086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61 not found: ID does not exist" containerID="086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.689245 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61"} err="failed to get container status \"086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61\": rpc error: code = NotFound desc = could not find container \"086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61\": container with ID starting with 086579b9d9b1129c4045ccc79d9052a9d293d62e1e6f19b8283acaa562aabe61 not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.689268 4893 scope.go:117] "RemoveContainer" containerID="0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.689691 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee\": container with ID starting with 0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee not found: ID does not exist" containerID="0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.689713 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee"} err="failed to get container status \"0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee\": rpc error: code = NotFound desc = could not find container \"0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee\": container with ID starting with 0dc9e912767032cdb60b63896636e573b4f6df5817c678fb00bca04809b23eee not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.689728 4893 scope.go:117] "RemoveContainer" containerID="67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.690059 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb\": container with ID starting with 67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb not found: ID does not exist" containerID="67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.690108 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb"} err="failed to get container status \"67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb\": rpc error: code = NotFound desc = could not find container \"67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb\": container with ID starting with 67fb6e3258810524baca6310f3158bd631cab495efae35751a6935f6b504eceb not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.690141 4893 scope.go:117] "RemoveContainer" containerID="9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.690861 4893 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a\": container with ID starting with 9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a not found: ID does not exist" containerID="9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.690890 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a"} err="failed to get container status \"9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a\": rpc error: code = NotFound desc = could not find container \"9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a\": container with ID starting with 9bb559a0110d78c5a24c05232d87f123298de2a9865937e122aa1e0b4f22800a not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.690916 4893 scope.go:117] "RemoveContainer" containerID="1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.691367 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530\": container with ID starting with 1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530 not found: ID does not exist" containerID="1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.691414 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530"} err="failed to get container status \"1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530\": rpc error: code = NotFound desc = could not find container 
\"1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530\": container with ID starting with 1d52e070ff2c06d0dd8b8a6ebcc2123cdd912571402285a57f7198351020c530 not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.691445 4893 scope.go:117] "RemoveContainer" containerID="c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830" Mar 14 07:02:27 crc kubenswrapper[4893]: E0314 07:02:27.692216 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830\": container with ID starting with c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830 not found: ID does not exist" containerID="c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.692245 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830"} err="failed to get container status \"c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830\": rpc error: code = NotFound desc = could not find container \"c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830\": container with ID starting with c80d2b2255ef2adea9837a99343f51b8e5b351eb739bb9aa8d9f66e72cc31830 not found: ID does not exist" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.959407 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.960331 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.960686 4893 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:27 crc kubenswrapper[4893]: I0314 07:02:27.960917 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.027779 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access\") pod \"c077ca86-7535-42ff-b1c4-2acec5092071\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.027846 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock\") pod \"c077ca86-7535-42ff-b1c4-2acec5092071\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 
07:02:28.027900 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir\") pod \"c077ca86-7535-42ff-b1c4-2acec5092071\" (UID: \"c077ca86-7535-42ff-b1c4-2acec5092071\") " Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.027980 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock" (OuterVolumeSpecName: "var-lock") pod "c077ca86-7535-42ff-b1c4-2acec5092071" (UID: "c077ca86-7535-42ff-b1c4-2acec5092071"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.028014 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c077ca86-7535-42ff-b1c4-2acec5092071" (UID: "c077ca86-7535-42ff-b1c4-2acec5092071"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.028083 4893 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.028097 4893 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c077ca86-7535-42ff-b1c4-2acec5092071-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.035987 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c077ca86-7535-42ff-b1c4-2acec5092071" (UID: "c077ca86-7535-42ff-b1c4-2acec5092071"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.128967 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c077ca86-7535-42ff-b1c4-2acec5092071-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.241501 4893 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.241973 4893 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.242205 4893 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.242397 4893 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.242653 4893 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.242687 4893 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.242983 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="200ms" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.444386 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="400ms" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.552753 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c077ca86-7535-42ff-b1c4-2acec5092071","Type":"ContainerDied","Data":"f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db"} Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.552825 4893 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35e1d81f7308f95f217c94c114ccf3aadc7a55410b42e727ee101a34a9885db" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.553463 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.567628 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.568177 4893 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: I0314 07:02:28.568568 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[4893]: E0314 07:02:28.845431 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="800ms" Mar 14 07:02:29 crc kubenswrapper[4893]: E0314 07:02:29.587709 4893 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca327ddbefc43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 07:02:24.896465987 +0000 UTC m=+224.158642829,LastTimestamp:2026-03-14 07:02:24.896465987 +0000 UTC m=+224.158642829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 07:02:29 crc kubenswrapper[4893]: E0314 07:02:29.647097 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="1.6s" Mar 14 07:02:29 crc kubenswrapper[4893]: I0314 07:02:29.731490 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:02:29 crc kubenswrapper[4893]: I0314 07:02:29.731570 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:02:31 crc kubenswrapper[4893]: E0314 07:02:31.248126 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="3.2s" Mar 14 07:02:31 crc kubenswrapper[4893]: I0314 07:02:31.378851 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:31 crc kubenswrapper[4893]: I0314 07:02:31.379675 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:34 crc kubenswrapper[4893]: E0314 07:02:34.449315 4893 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="6.4s" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.346206 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7.scope\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-conmon-d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.375813 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.376941 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.378288 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.389588 4893 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.389619 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.390169 4893 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.390970 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:37 crc kubenswrapper[4893]: W0314 07:02:37.413016 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-d698f4bc978b2b58c5c5a33968eb239c2fc466f51fdbd7bf8ab03e248a177e27 WatchSource:0}: Error finding container d698f4bc978b2b58c5c5a33968eb239c2fc466f51fdbd7bf8ab03e248a177e27: Status 404 returned error can't find the container with id d698f4bc978b2b58c5c5a33968eb239c2fc466f51fdbd7bf8ab03e248a177e27 Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.514359 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:37Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:37Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:37Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:02:37Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 
38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.514770 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.515035 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.515248 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.515505 4893 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: E0314 07:02:37.515546 4893 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.611763 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d698f4bc978b2b58c5c5a33968eb239c2fc466f51fdbd7bf8ab03e248a177e27"} Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.614812 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 07:02:37 crc 
kubenswrapper[4893]: I0314 07:02:37.615422 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.615464 4893 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7" exitCode=1 Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.615487 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7"} Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.616054 4893 scope.go:117] "RemoveContainer" containerID="d9c375df2546244fc3486a22cc2fdfa8309783f94a19e116dc1c3fc9a6719dc7" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.616138 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.616365 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:37 crc kubenswrapper[4893]: I0314 07:02:37.616654 4893 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.624003 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.625581 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.625681 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ec02ee43d519e96d12bcf27f041d09f2dfbd6888995d41567db16e2c687a00ed"} Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.626601 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627007 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627247 4893 status_manager.go:851] "Failed to 
get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627598 4893 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="78fe5b53dd728973fa3c9aa3d601e3d17f4cb23b8d681bd3355b14200d7a6cf1" exitCode=0 Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627647 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"78fe5b53dd728973fa3c9aa3d601e3d17f4cb23b8d681bd3355b14200d7a6cf1"} Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627840 4893 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.627865 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.628249 4893 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: E0314 07:02:38.628268 4893 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.628651 4893 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:38 crc kubenswrapper[4893]: I0314 07:02:38.628919 4893 status_manager.go:851] "Failed to get status for pod" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Mar 14 07:02:39 crc kubenswrapper[4893]: I0314 07:02:39.634875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d3a2fcd06be069cdf693818afee9f394f1a9d89130c8637f6c4a0013a9ac5351"} Mar 14 07:02:39 crc kubenswrapper[4893]: I0314 07:02:39.635240 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42eafb1433e4c8a16aa9938588d841a12013a74f021207cd4a383fb964066ccd"} Mar 14 07:02:39 crc kubenswrapper[4893]: I0314 07:02:39.635253 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77e8aa2d7629e3a84335dc040a16f88915800ead3f13a2f15653c76a0bedae73"} Mar 14 07:02:39 crc kubenswrapper[4893]: I0314 07:02:39.635265 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"148a18006e51ef99cd6b18e2cc1ffaf749eaf3b25faa58d18a4e1d543c7a9de6"} Mar 14 07:02:40 crc kubenswrapper[4893]: I0314 07:02:40.664902 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b340e7aa47e533807d8efb06999835d6feacb4928b14dec72230b585dd4df94"} Mar 14 07:02:40 crc kubenswrapper[4893]: I0314 07:02:40.665500 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:40 crc kubenswrapper[4893]: I0314 07:02:40.665630 4893 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:40 crc kubenswrapper[4893]: I0314 07:02:40.665672 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.269840 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" containerName="oauth-openshift" containerID="cri-o://fa62d26f6bce6ceb8f7e651a0b2d60460ae87c6adc1dd272bd9ccc46e5424bc8" gracePeriod=15 Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.392228 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.392297 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.403763 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.678225 4893 generic.go:334] "Generic (PLEG): container finished" podID="928468fc-c237-4779-a2c6-7365b3764fe8" containerID="fa62d26f6bce6ceb8f7e651a0b2d60460ae87c6adc1dd272bd9ccc46e5424bc8" exitCode=0 Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.678287 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" event={"ID":"928468fc-c237-4779-a2c6-7365b3764fe8","Type":"ContainerDied","Data":"fa62d26f6bce6ceb8f7e651a0b2d60460ae87c6adc1dd272bd9ccc46e5424bc8"} Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.787097 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.925783 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.926482 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927592 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksktz\" (UniqueName: \"kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927648 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927710 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927755 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927820 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927865 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927903 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927943 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.927992 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.928037 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.928093 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.928124 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session\") pod \"928468fc-c237-4779-a2c6-7365b3764fe8\" (UID: \"928468fc-c237-4779-a2c6-7365b3764fe8\") " Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.928755 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.928853 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.929416 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.933373 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.935456 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.935682 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.936463 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.936895 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.937070 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz" (OuterVolumeSpecName: "kube-api-access-ksktz") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "kube-api-access-ksktz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.937578 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.937407 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.938899 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.939882 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[4893]: I0314 07:02:42.944931 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "928468fc-c237-4779-a2c6-7365b3764fe8" (UID: "928468fc-c237-4779-a2c6-7365b3764fe8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029812 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029875 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksktz\" (UniqueName: \"kubernetes.io/projected/928468fc-c237-4779-a2c6-7365b3764fe8-kube-api-access-ksktz\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029894 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 
07:02:43.029913 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029931 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029947 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.029999 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030018 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030039 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030057 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030073 4893 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/928468fc-c237-4779-a2c6-7365b3764fe8-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030091 4893 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/928468fc-c237-4779-a2c6-7365b3764fe8-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030106 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.030125 4893 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/928468fc-c237-4779-a2c6-7365b3764fe8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.685026 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" event={"ID":"928468fc-c237-4779-a2c6-7365b3764fe8","Type":"ContainerDied","Data":"f35f87f1115fec26294aa10c852357897d09d5951e9fd108f37f1f38e705df5f"} Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.685093 4893 scope.go:117] "RemoveContainer" containerID="fa62d26f6bce6ceb8f7e651a0b2d60460ae87c6adc1dd272bd9ccc46e5424bc8" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.685136 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sfrgk" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.974056 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:02:43 crc kubenswrapper[4893]: I0314 07:02:43.984075 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:02:44 crc kubenswrapper[4893]: I0314 07:02:44.694398 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:02:45 crc kubenswrapper[4893]: I0314 07:02:45.678399 4893 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:45 crc kubenswrapper[4893]: I0314 07:02:45.698680 4893 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:45 crc kubenswrapper[4893]: I0314 07:02:45.698723 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:45 crc kubenswrapper[4893]: I0314 07:02:45.705948 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:02:45 crc kubenswrapper[4893]: I0314 07:02:45.757024 4893 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="31e8298d-ec71-4879-821f-fe5dd166a7eb" Mar 14 07:02:46 crc kubenswrapper[4893]: I0314 07:02:46.707530 4893 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:46 crc 
kubenswrapper[4893]: I0314 07:02:46.707586 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dd3078e8-83f7-4c25-a9c0-cc8d37f36c5f" Mar 14 07:02:46 crc kubenswrapper[4893]: I0314 07:02:46.710844 4893 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="31e8298d-ec71-4879-821f-fe5dd166a7eb" Mar 14 07:02:55 crc kubenswrapper[4893]: I0314 07:02:55.096206 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 07:02:55 crc kubenswrapper[4893]: I0314 07:02:55.820221 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.213871 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.304416 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.329222 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.489846 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.776272 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:02:56 crc kubenswrapper[4893]: I0314 07:02:56.837671 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 07:02:57 crc 
kubenswrapper[4893]: I0314 07:02:57.145876 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.264176 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.469766 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.528871 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.606331 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.639999 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.774644 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 07:02:57 crc kubenswrapper[4893]: I0314 07:02:57.834914 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.044955 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.202419 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.253902 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 07:02:58 crc 
kubenswrapper[4893]: I0314 07:02:58.281367 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.307742 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.643402 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.682261 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.787392 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.846684 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 07:02:58 crc kubenswrapper[4893]: I0314 07:02:58.886629 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.085295 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.329177 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.329404 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.348247 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.410066 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.426255 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.468959 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.490978 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.497152 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.552126 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.615565 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.731623 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.731685 4893 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.811730 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.891645 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.934907 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 07:02:59 crc kubenswrapper[4893]: I0314 07:02:59.974979 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.023248 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.025325 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.066040 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.092859 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.221760 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.224024 4893 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.224408 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.269273 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.296598 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.408421 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.415416 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.525472 4893 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.572052 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.595824 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.644576 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.722741 4893 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.951882 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 07:03:00 crc kubenswrapper[4893]: I0314 07:03:00.988738 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.072367 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.141820 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.157177 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.223572 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.313598 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.324235 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.443095 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.469986 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 07:03:01 crc 
kubenswrapper[4893]: I0314 07:03:01.499569 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.500445 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.543957 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.563755 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.596299 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.714267 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.772156 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.856882 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.942115 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.952956 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.963384 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 07:03:01 crc kubenswrapper[4893]: I0314 07:03:01.975614 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.177388 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.217449 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.245593 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.337344 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.395089 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.401616 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.451359 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.492133 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.616444 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.676915 4893 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.678642 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.681138 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.689003 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.689368 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.774175 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.783814 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.830406 4893 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.833946 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.833927976 podStartE2EDuration="38.833927976s" podCreationTimestamp="2026-03-14 07:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:45.688239995 +0000 UTC m=+244.950416797" watchObservedRunningTime="2026-03-14 07:03:02.833927976 +0000 UTC m=+262.096104768" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.835048 4893 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sfrgk","openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.835097 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.838459 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.850921 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.850912223999998 podStartE2EDuration="17.850912224s" podCreationTimestamp="2026-03-14 07:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:02.849981349 +0000 UTC m=+262.112158151" watchObservedRunningTime="2026-03-14 07:03:02.850912224 +0000 UTC m=+262.113089016" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.877589 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.953269 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 07:03:02 crc kubenswrapper[4893]: I0314 07:03:02.988325 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.029087 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.167712 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.175720 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.217073 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.225426 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.285109 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.328259 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.385515 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" path="/var/lib/kubelet/pods/928468fc-c237-4779-a2c6-7365b3764fe8/volumes" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.409994 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.460249 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.480843 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.498704 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 07:03:03 crc 
kubenswrapper[4893]: I0314 07:03:03.583020 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.613054 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.635758 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.707873 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.741402 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.788476 4893 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.804588 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.806615 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.950882 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 07:03:03 crc kubenswrapper[4893]: I0314 07:03:03.975038 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.037962 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.070217 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.176939 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.178343 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.340255 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.369900 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.423411 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.494448 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.696688 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.706727 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:03:04 crc kubenswrapper[4893]: I0314 07:03:04.869296 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.012906 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.078896 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.086979 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.113215 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.195050 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.291902 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.352361 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.378466 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.444796 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.499805 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.560756 4893 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.647301 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.656354 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.704605 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.759287 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.759903 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.882052 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 07:03:05 crc kubenswrapper[4893]: I0314 07:03:05.976411 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.015584 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.080107 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.080749 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 07:03:06 crc 
kubenswrapper[4893]: I0314 07:03:06.095281 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.137364 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.157122 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.157970 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.170623 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.186402 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.228132 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.291069 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.321842 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.336415 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.371267 4893 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.407735 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.430041 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.467658 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.473557 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.481944 4893 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.513626 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.521588 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.543391 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.554682 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.602437 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.764013 4893 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.767611 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.785638 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.834548 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.973239 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 07:03:06 crc kubenswrapper[4893]: I0314 07:03:06.988726 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.066380 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.082794 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.106095 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.115277 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.267920 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 
07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.341170 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.342298 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.456206 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.467591 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.477760 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.500997 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.563086 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.598331 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.606838 4893 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.783610 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.828728 4893 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.873931 4893 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.874258 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a" gracePeriod=5 Mar 14 07:03:07 crc kubenswrapper[4893]: I0314 07:03:07.968774 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.100904 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb"] Mar 14 07:03:08 crc kubenswrapper[4893]: E0314 07:03:08.101197 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" containerName="oauth-openshift" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101212 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" containerName="oauth-openshift" Mar 14 07:03:08 crc kubenswrapper[4893]: E0314 07:03:08.101228 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101237 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:03:08 crc kubenswrapper[4893]: E0314 07:03:08.101253 4893 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c077ca86-7535-42ff-b1c4-2acec5092071" containerName="installer" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101261 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" containerName="installer" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101376 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101389 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="928468fc-c237-4779-a2c6-7365b3764fe8" containerName="oauth-openshift" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101402 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c077ca86-7535-42ff-b1c4-2acec5092071" containerName="installer" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.101855 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.104858 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.114058 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.114555 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115060 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115263 4893 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115446 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115646 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115671 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115756 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115856 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115908 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.115859 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.124267 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.129929 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.137083 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 07:03:08 crc 
kubenswrapper[4893]: I0314 07:03:08.149859 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174154 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-serving-cert\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174255 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174279 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-error\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174306 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-cliconfig\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " 
pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174326 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-login\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174353 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5p7\" (UniqueName: \"kubernetes.io/projected/d714ad20-f905-48c1-b6c1-934f2dd4f330-kube-api-access-sk5p7\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174379 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-dir\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174429 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174467 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-session\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174507 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174684 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-router-certs\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174735 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-service-ca\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174782 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-policies\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.174838 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.190740 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb"] Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.235954 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276213 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-router-certs\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276259 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-service-ca\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc 
kubenswrapper[4893]: I0314 07:03:08.276295 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-policies\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276330 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276352 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-serving-cert\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276393 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-error\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276416 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276441 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-cliconfig\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276461 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-login\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276514 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5p7\" (UniqueName: \"kubernetes.io/projected/d714ad20-f905-48c1-b6c1-934f2dd4f330-kube-api-access-sk5p7\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276569 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-dir\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc 
kubenswrapper[4893]: I0314 07:03:08.276594 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276650 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-session\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.276714 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.277582 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-service-ca\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.277581 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-dir\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.278422 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.278704 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-cliconfig\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.279106 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d714ad20-f905-48c1-b6c1-934f2dd4f330-audit-policies\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.283112 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-session\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 
07:03:08.284111 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-login\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.286348 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.286663 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-router-certs\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.287571 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-template-error\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.297963 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-serving-cert\") 
pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.299858 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.301571 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d714ad20-f905-48c1-b6c1-934f2dd4f330-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.307680 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5p7\" (UniqueName: \"kubernetes.io/projected/d714ad20-f905-48c1-b6c1-934f2dd4f330-kube-api-access-sk5p7\") pod \"oauth-openshift-774cdcb5dd-g4rgb\" (UID: \"d714ad20-f905-48c1-b6c1-934f2dd4f330\") " pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.386456 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.435163 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.448620 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.453760 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.458059 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.503390 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.587793 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.714955 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.723132 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.748745 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.817277 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.831133 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.835369 4893 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb"] Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.867172 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.940926 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 07:03:08 crc kubenswrapper[4893]: I0314 07:03:08.987780 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.156903 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.240166 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.248090 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.509004 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.658965 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.709545 4893 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.747150 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 
07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.785194 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.858984 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-774cdcb5dd-g4rgb_d714ad20-f905-48c1-b6c1-934f2dd4f330/oauth-openshift/0.log" Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.859038 4893 generic.go:334] "Generic (PLEG): container finished" podID="d714ad20-f905-48c1-b6c1-934f2dd4f330" containerID="e78e1848f785d24cb8f6b856c324832a7e3f1df670619b98fb7461e751d2848e" exitCode=255 Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.859073 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" event={"ID":"d714ad20-f905-48c1-b6c1-934f2dd4f330","Type":"ContainerDied","Data":"e78e1848f785d24cb8f6b856c324832a7e3f1df670619b98fb7461e751d2848e"} Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.859102 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" event={"ID":"d714ad20-f905-48c1-b6c1-934f2dd4f330","Type":"ContainerStarted","Data":"30323fb20b705e852076c3db1ffbec5b07866d09790b77eef4394c53cc3de774"} Mar 14 07:03:09 crc kubenswrapper[4893]: I0314 07:03:09.859557 4893 scope.go:117] "RemoveContainer" containerID="e78e1848f785d24cb8f6b856c324832a7e3f1df670619b98fb7461e751d2848e" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.009116 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.043052 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.362318 4893 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.538714 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.580278 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.802789 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.871355 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-774cdcb5dd-g4rgb_d714ad20-f905-48c1-b6c1-934f2dd4f330/oauth-openshift/1.log" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.872740 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-774cdcb5dd-g4rgb_d714ad20-f905-48c1-b6c1-934f2dd4f330/oauth-openshift/0.log" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.872819 4893 generic.go:334] "Generic (PLEG): container finished" podID="d714ad20-f905-48c1-b6c1-934f2dd4f330" containerID="283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c" exitCode=255 Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.872871 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" event={"ID":"d714ad20-f905-48c1-b6c1-934f2dd4f330","Type":"ContainerDied","Data":"283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c"} Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.872923 4893 scope.go:117] "RemoveContainer" containerID="e78e1848f785d24cb8f6b856c324832a7e3f1df670619b98fb7461e751d2848e" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.873738 
4893 scope.go:117] "RemoveContainer" containerID="283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c" Mar 14 07:03:10 crc kubenswrapper[4893]: E0314 07:03:10.873980 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-774cdcb5dd-g4rgb_openshift-authentication(d714ad20-f905-48c1-b6c1-934f2dd4f330)\"" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" podUID="d714ad20-f905-48c1-b6c1-934f2dd4f330" Mar 14 07:03:10 crc kubenswrapper[4893]: I0314 07:03:10.933363 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.018365 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.041922 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.044644 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.347198 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.661849 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.712789 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.886233 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-774cdcb5dd-g4rgb_d714ad20-f905-48c1-b6c1-934f2dd4f330/oauth-openshift/1.log" Mar 14 07:03:11 crc kubenswrapper[4893]: I0314 07:03:11.887178 4893 scope.go:117] "RemoveContainer" containerID="283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c" Mar 14 07:03:11 crc kubenswrapper[4893]: E0314 07:03:11.887797 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-774cdcb5dd-g4rgb_openshift-authentication(d714ad20-f905-48c1-b6c1-934f2dd4f330)\"" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" podUID="d714ad20-f905-48c1-b6c1-934f2dd4f330" Mar 14 07:03:12 crc kubenswrapper[4893]: I0314 07:03:12.060772 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 07:03:12 crc kubenswrapper[4893]: I0314 07:03:12.266489 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 07:03:12 crc kubenswrapper[4893]: I0314 07:03:12.934621 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.467827 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.467901 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.555869 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.555941 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.555992 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556012 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556026 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556040 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556097 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556165 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556195 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556229 4893 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.556245 4893 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.567668 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.657488 4893 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.657580 4893 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.657608 4893 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.808222 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 
07:03:13.899897 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.900034 4893 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a" exitCode=137 Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.900117 4893 scope.go:117] "RemoveContainer" containerID="f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.900155 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.917103 4893 scope.go:117] "RemoveContainer" containerID="f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a" Mar 14 07:03:13 crc kubenswrapper[4893]: E0314 07:03:13.917561 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a\": container with ID starting with f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a not found: ID does not exist" containerID="f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a" Mar 14 07:03:13 crc kubenswrapper[4893]: I0314 07:03:13.917610 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a"} err="failed to get container status \"f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a\": rpc error: code = NotFound desc = could not find container \"f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a\": container with ID starting with 
f3947ae54f8768c62816ae39099452483ad6c05295536b2b977b188ce1d87d1a not found: ID does not exist" Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.384934 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.385776 4893 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.398029 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.398082 4893 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bbccefe8-a3cb-4cd6-a92e-be78b189ed1a" Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.402283 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:03:15 crc kubenswrapper[4893]: I0314 07:03:15.402333 4893 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bbccefe8-a3cb-4cd6-a92e-be78b189ed1a" Mar 14 07:03:18 crc kubenswrapper[4893]: I0314 07:03:18.436049 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:18 crc kubenswrapper[4893]: I0314 07:03:18.436414 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:18 crc kubenswrapper[4893]: I0314 07:03:18.437435 4893 scope.go:117] "RemoveContainer" containerID="283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c" Mar 14 
07:03:18 crc kubenswrapper[4893]: E0314 07:03:18.438012 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-774cdcb5dd-g4rgb_openshift-authentication(d714ad20-f905-48c1-b6c1-934f2dd4f330)\"" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" podUID="d714ad20-f905-48c1-b6c1-934f2dd4f330" Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.272509 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.829968 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.830230 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" podUID="fd3b7325-46db-44d8-9c67-cde475def14a" containerName="controller-manager" containerID="cri-o://f4de5eb4111e51cb92497df15674cddc8e3b7857637f956cf29506c6ebefeaeb" gracePeriod=30 Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.835019 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.835217 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" podUID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" containerName="route-controller-manager" containerID="cri-o://8b36934016777d8402938fd1d798fc42a0fe65f4ac02866e089873c00e82370e" gracePeriod=30 Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.973051 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" containerID="8b36934016777d8402938fd1d798fc42a0fe65f4ac02866e089873c00e82370e" exitCode=0 Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.973196 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" event={"ID":"cdc6aea3-c426-4e6b-9f31-cdace87b288e","Type":"ContainerDied","Data":"8b36934016777d8402938fd1d798fc42a0fe65f4ac02866e089873c00e82370e"} Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.974781 4893 generic.go:334] "Generic (PLEG): container finished" podID="fd3b7325-46db-44d8-9c67-cde475def14a" containerID="f4de5eb4111e51cb92497df15674cddc8e3b7857637f956cf29506c6ebefeaeb" exitCode=0 Mar 14 07:03:25 crc kubenswrapper[4893]: I0314 07:03:25.974842 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" event={"ID":"fd3b7325-46db-44d8-9c67-cde475def14a","Type":"ContainerDied","Data":"f4de5eb4111e51cb92497df15674cddc8e3b7857637f956cf29506c6ebefeaeb"} Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.233983 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.238493 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318348 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca\") pod \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318393 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvrsr\" (UniqueName: \"kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr\") pod \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318421 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles\") pod \"fd3b7325-46db-44d8-9c67-cde475def14a\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318437 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert\") pod \"fd3b7325-46db-44d8-9c67-cde475def14a\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318453 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config\") pod \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318477 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca\") pod \"fd3b7325-46db-44d8-9c67-cde475def14a\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318507 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzz2p\" (UniqueName: \"kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p\") pod \"fd3b7325-46db-44d8-9c67-cde475def14a\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318555 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config\") pod \"fd3b7325-46db-44d8-9c67-cde475def14a\" (UID: \"fd3b7325-46db-44d8-9c67-cde475def14a\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.318601 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert\") pod \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\" (UID: \"cdc6aea3-c426-4e6b-9f31-cdace87b288e\") " Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.319513 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fd3b7325-46db-44d8-9c67-cde475def14a" (UID: "fd3b7325-46db-44d8-9c67-cde475def14a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.319855 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca" (OuterVolumeSpecName: "client-ca") pod "cdc6aea3-c426-4e6b-9f31-cdace87b288e" (UID: "cdc6aea3-c426-4e6b-9f31-cdace87b288e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.319910 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config" (OuterVolumeSpecName: "config") pod "cdc6aea3-c426-4e6b-9f31-cdace87b288e" (UID: "cdc6aea3-c426-4e6b-9f31-cdace87b288e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.319961 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config" (OuterVolumeSpecName: "config") pod "fd3b7325-46db-44d8-9c67-cde475def14a" (UID: "fd3b7325-46db-44d8-9c67-cde475def14a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.320365 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca" (OuterVolumeSpecName: "client-ca") pod "fd3b7325-46db-44d8-9c67-cde475def14a" (UID: "fd3b7325-46db-44d8-9c67-cde475def14a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.323573 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p" (OuterVolumeSpecName: "kube-api-access-wzz2p") pod "fd3b7325-46db-44d8-9c67-cde475def14a" (UID: "fd3b7325-46db-44d8-9c67-cde475def14a"). InnerVolumeSpecName "kube-api-access-wzz2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.323702 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cdc6aea3-c426-4e6b-9f31-cdace87b288e" (UID: "cdc6aea3-c426-4e6b-9f31-cdace87b288e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.323695 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr" (OuterVolumeSpecName: "kube-api-access-tvrsr") pod "cdc6aea3-c426-4e6b-9f31-cdace87b288e" (UID: "cdc6aea3-c426-4e6b-9f31-cdace87b288e"). InnerVolumeSpecName "kube-api-access-tvrsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.324635 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd3b7325-46db-44d8-9c67-cde475def14a" (UID: "fd3b7325-46db-44d8-9c67-cde475def14a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419568 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419728 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvrsr\" (UniqueName: \"kubernetes.io/projected/cdc6aea3-c426-4e6b-9f31-cdace87b288e-kube-api-access-tvrsr\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419755 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419766 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3b7325-46db-44d8-9c67-cde475def14a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419777 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc6aea3-c426-4e6b-9f31-cdace87b288e-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419787 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419797 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzz2p\" (UniqueName: \"kubernetes.io/projected/fd3b7325-46db-44d8-9c67-cde475def14a-kube-api-access-wzz2p\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419806 4893 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3b7325-46db-44d8-9c67-cde475def14a-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.419817 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc6aea3-c426-4e6b-9f31-cdace87b288e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.983719 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" event={"ID":"cdc6aea3-c426-4e6b-9f31-cdace87b288e","Type":"ContainerDied","Data":"e5f4c52cf914d1d29b7dfef16bafdbd0da7f205d2228ccee125af67c15d7ae22"} Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.983775 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.983842 4893 scope.go:117] "RemoveContainer" containerID="8b36934016777d8402938fd1d798fc42a0fe65f4ac02866e089873c00e82370e" Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.987482 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" event={"ID":"fd3b7325-46db-44d8-9c67-cde475def14a","Type":"ContainerDied","Data":"c4aa915ee540138089f1183215ee8d7e6e9c2ab268c13ec53f83e15f22ee3d83"} Mar 14 07:03:26 crc kubenswrapper[4893]: I0314 07:03:26.987707 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5655bc4dbd-26tjh" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.004307 4893 scope.go:117] "RemoveContainer" containerID="f4de5eb4111e51cb92497df15674cddc8e3b7857637f956cf29506c6ebefeaeb" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.055131 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.062623 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59689957cc-nqtk2"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.068653 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.072912 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5655bc4dbd-26tjh"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.139953 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:27 crc kubenswrapper[4893]: E0314 07:03:27.140373 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" containerName="route-controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.140407 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" containerName="route-controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: E0314 07:03:27.140440 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd3b7325-46db-44d8-9c67-cde475def14a" containerName="controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.140458 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd3b7325-46db-44d8-9c67-cde475def14a" containerName="controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.141667 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd3b7325-46db-44d8-9c67-cde475def14a" containerName="controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.141712 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" containerName="route-controller-manager" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.142497 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.145070 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.145623 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.145773 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.145978 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.149871 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.150056 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.150166 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.150582 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.152859 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.153104 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.153315 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.153534 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.153657 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.153675 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:27 crc 
kubenswrapper[4893]: I0314 07:03:27.155665 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.159170 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.162138 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.229788 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmv9\" (UniqueName: \"kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.229835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.229864 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.229882 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.229911 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.230066 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.230177 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.230221 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: 
\"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.230264 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlq4\" (UniqueName: \"kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331591 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331632 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331655 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331680 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331725 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331776 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.331817 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwlq4\" (UniqueName: \"kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.332093 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmv9\" (UniqueName: \"kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " 
pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.332148 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.333184 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.333839 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.334585 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.334596 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles\") pod 
\"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.335380 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.336968 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.336987 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.347743 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdmv9\" (UniqueName: \"kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9\") pod \"controller-manager-776ff7b55f-vvfkg\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.352209 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwlq4\" (UniqueName: 
\"kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4\") pod \"route-controller-manager-668b86bfc6-xxt8j\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.384784 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc6aea3-c426-4e6b-9f31-cdace87b288e" path="/var/lib/kubelet/pods/cdc6aea3-c426-4e6b-9f31-cdace87b288e/volumes" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.386059 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd3b7325-46db-44d8-9c67-cde475def14a" path="/var/lib/kubelet/pods/fd3b7325-46db-44d8-9c67-cde475def14a/volumes" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.487182 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.496847 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.965418 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:27 crc kubenswrapper[4893]: I0314 07:03:27.971182 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:27 crc kubenswrapper[4893]: W0314 07:03:27.980373 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a98fc7_1d4e_4ddc_bf42_3c6bd46dd6b4.slice/crio-b1261e72336785a90485351e185cbc3e972ea15b19350befbc4c4fccda511bad WatchSource:0}: Error finding container b1261e72336785a90485351e185cbc3e972ea15b19350befbc4c4fccda511bad: Status 404 returned error can't find the container with id b1261e72336785a90485351e185cbc3e972ea15b19350befbc4c4fccda511bad Mar 14 07:03:28 crc kubenswrapper[4893]: I0314 07:03:28.000046 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" event={"ID":"f14e9df0-fe2a-4c08-b357-db8a4472e116","Type":"ContainerStarted","Data":"bd1cb6418bd638a06634147e7dd57d7da0458376610ed33ca8738eaf1fad11ea"} Mar 14 07:03:28 crc kubenswrapper[4893]: I0314 07:03:28.001104 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" event={"ID":"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4","Type":"ContainerStarted","Data":"b1261e72336785a90485351e185cbc3e972ea15b19350befbc4c4fccda511bad"} Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.015926 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" 
event={"ID":"f14e9df0-fe2a-4c08-b357-db8a4472e116","Type":"ContainerStarted","Data":"0f66b8cd02e2aba4273a17d6ed4c68031985b0d3ced08f257394b4961343614e"} Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.016193 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.019434 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" event={"ID":"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4","Type":"ContainerStarted","Data":"336878cdfbec88d6507bdc515aca002d7e48fdc1caae23e863901db060c8009c"} Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.019733 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.026021 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.026448 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.038229 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" podStartSLOduration=4.038216348 podStartE2EDuration="4.038216348s" podCreationTimestamp="2026-03-14 07:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:29.036772529 +0000 UTC m=+288.298949361" watchObservedRunningTime="2026-03-14 07:03:29.038216348 +0000 UTC m=+288.300393140" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 
07:03:29.090755 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" podStartSLOduration=4.090735902 podStartE2EDuration="4.090735902s" podCreationTimestamp="2026-03-14 07:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:29.085772216 +0000 UTC m=+288.347949038" watchObservedRunningTime="2026-03-14 07:03:29.090735902 +0000 UTC m=+288.352912704" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.731923 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.732773 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.732837 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.733728 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:03:29 crc kubenswrapper[4893]: I0314 07:03:29.733807 
4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32" gracePeriod=600 Mar 14 07:03:30 crc kubenswrapper[4893]: I0314 07:03:30.027750 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32" exitCode=0 Mar 14 07:03:30 crc kubenswrapper[4893]: I0314 07:03:30.028459 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32"} Mar 14 07:03:31 crc kubenswrapper[4893]: I0314 07:03:31.037483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d"} Mar 14 07:03:32 crc kubenswrapper[4893]: I0314 07:03:32.377366 4893 scope.go:117] "RemoveContainer" containerID="283a7a39b01fc688d95215d0c1346491bbd236b1b0b48f64f3c3dab67dbfab4c" Mar 14 07:03:33 crc kubenswrapper[4893]: I0314 07:03:33.047183 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-774cdcb5dd-g4rgb_d714ad20-f905-48c1-b6c1-934f2dd4f330/oauth-openshift/1.log" Mar 14 07:03:33 crc kubenswrapper[4893]: I0314 07:03:33.047580 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" 
event={"ID":"d714ad20-f905-48c1-b6c1-934f2dd4f330","Type":"ContainerStarted","Data":"ecb0c96bbe50980e06d482eeb4d34fbf63bc412d8a4ff81bf4a9d076e8beef31"} Mar 14 07:03:33 crc kubenswrapper[4893]: I0314 07:03:33.048028 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:33 crc kubenswrapper[4893]: I0314 07:03:33.053332 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" Mar 14 07:03:33 crc kubenswrapper[4893]: I0314 07:03:33.074989 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-774cdcb5dd-g4rgb" podStartSLOduration=76.074964714 podStartE2EDuration="1m16.074964714s" podCreationTimestamp="2026-03-14 07:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:33.069592226 +0000 UTC m=+292.331769058" watchObservedRunningTime="2026-03-14 07:03:33.074964714 +0000 UTC m=+292.337141516" Mar 14 07:03:36 crc kubenswrapper[4893]: I0314 07:03:36.069239 4893 generic.go:334] "Generic (PLEG): container finished" podID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerID="5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9" exitCode=0 Mar 14 07:03:36 crc kubenswrapper[4893]: I0314 07:03:36.069296 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerDied","Data":"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9"} Mar 14 07:03:36 crc kubenswrapper[4893]: I0314 07:03:36.070502 4893 scope.go:117] "RemoveContainer" containerID="5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9" Mar 14 07:03:37 crc kubenswrapper[4893]: I0314 07:03:37.078148 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerStarted","Data":"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa"} Mar 14 07:03:37 crc kubenswrapper[4893]: I0314 07:03:37.078921 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:03:37 crc kubenswrapper[4893]: I0314 07:03:37.081571 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:03:45 crc kubenswrapper[4893]: I0314 07:03:45.834913 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:45 crc kubenswrapper[4893]: I0314 07:03:45.836035 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" podUID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" containerName="controller-manager" containerID="cri-o://336878cdfbec88d6507bdc515aca002d7e48fdc1caae23e863901db060c8009c" gracePeriod=30 Mar 14 07:03:45 crc kubenswrapper[4893]: I0314 07:03:45.848640 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:45 crc kubenswrapper[4893]: I0314 07:03:45.848948 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" podUID="f14e9df0-fe2a-4c08-b357-db8a4472e116" containerName="route-controller-manager" containerID="cri-o://0f66b8cd02e2aba4273a17d6ed4c68031985b0d3ced08f257394b4961343614e" gracePeriod=30 Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.137467 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="f14e9df0-fe2a-4c08-b357-db8a4472e116" containerID="0f66b8cd02e2aba4273a17d6ed4c68031985b0d3ced08f257394b4961343614e" exitCode=0 Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.137607 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" event={"ID":"f14e9df0-fe2a-4c08-b357-db8a4472e116","Type":"ContainerDied","Data":"0f66b8cd02e2aba4273a17d6ed4c68031985b0d3ced08f257394b4961343614e"} Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.139831 4893 generic.go:334] "Generic (PLEG): container finished" podID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" containerID="336878cdfbec88d6507bdc515aca002d7e48fdc1caae23e863901db060c8009c" exitCode=0 Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.139897 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" event={"ID":"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4","Type":"ContainerDied","Data":"336878cdfbec88d6507bdc515aca002d7e48fdc1caae23e863901db060c8009c"} Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.362377 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.407723 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwlq4\" (UniqueName: \"kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4\") pod \"f14e9df0-fe2a-4c08-b357-db8a4472e116\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.407897 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert\") pod \"f14e9df0-fe2a-4c08-b357-db8a4472e116\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.408082 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config\") pod \"f14e9df0-fe2a-4c08-b357-db8a4472e116\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.409613 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config" (OuterVolumeSpecName: "config") pod "f14e9df0-fe2a-4c08-b357-db8a4472e116" (UID: "f14e9df0-fe2a-4c08-b357-db8a4472e116"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.410359 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca\") pod \"f14e9df0-fe2a-4c08-b357-db8a4472e116\" (UID: \"f14e9df0-fe2a-4c08-b357-db8a4472e116\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.411678 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca" (OuterVolumeSpecName: "client-ca") pod "f14e9df0-fe2a-4c08-b357-db8a4472e116" (UID: "f14e9df0-fe2a-4c08-b357-db8a4472e116"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.411879 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.411915 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f14e9df0-fe2a-4c08-b357-db8a4472e116-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.414117 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f14e9df0-fe2a-4c08-b357-db8a4472e116" (UID: "f14e9df0-fe2a-4c08-b357-db8a4472e116"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.414636 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4" (OuterVolumeSpecName: "kube-api-access-xwlq4") pod "f14e9df0-fe2a-4c08-b357-db8a4472e116" (UID: "f14e9df0-fe2a-4c08-b357-db8a4472e116"). InnerVolumeSpecName "kube-api-access-xwlq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.451677 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513246 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config\") pod \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513291 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles\") pod \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513350 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca\") pod \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513397 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert\") pod \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513434 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmv9\" (UniqueName: \"kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9\") pod \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\" (UID: \"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4\") " Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513674 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwlq4\" (UniqueName: \"kubernetes.io/projected/f14e9df0-fe2a-4c08-b357-db8a4472e116-kube-api-access-xwlq4\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.513688 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f14e9df0-fe2a-4c08-b357-db8a4472e116-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.514298 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" (UID: "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.514430 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" (UID: "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.514443 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config" (OuterVolumeSpecName: "config") pod "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" (UID: "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.517078 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" (UID: "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.517103 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9" (OuterVolumeSpecName: "kube-api-access-bdmv9") pod "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" (UID: "13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4"). InnerVolumeSpecName "kube-api-access-bdmv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.614918 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmv9\" (UniqueName: \"kubernetes.io/projected/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-kube-api-access-bdmv9\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.614967 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.614980 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.614991 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[4893]: I0314 07:03:46.615001 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.150609 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" event={"ID":"13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4","Type":"ContainerDied","Data":"b1261e72336785a90485351e185cbc3e972ea15b19350befbc4c4fccda511bad"} Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.150875 4893 scope.go:117] "RemoveContainer" containerID="336878cdfbec88d6507bdc515aca002d7e48fdc1caae23e863901db060c8009c" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.151053 4893 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-vvfkg" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.154489 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" event={"ID":"f14e9df0-fe2a-4c08-b357-db8a4472e116","Type":"ContainerDied","Data":"bd1cb6418bd638a06634147e7dd57d7da0458376610ed33ca8738eaf1fad11ea"} Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.154640 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.161725 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:03:47 crc kubenswrapper[4893]: E0314 07:03:47.162202 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" containerName="controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.162243 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" containerName="controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: E0314 07:03:47.162284 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14e9df0-fe2a-4c08-b357-db8a4472e116" containerName="route-controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.162307 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14e9df0-fe2a-4c08-b357-db8a4472e116" containerName="route-controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.162607 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14e9df0-fe2a-4c08-b357-db8a4472e116" containerName="route-controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.162664 4893 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" containerName="controller-manager" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.164159 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.171513 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.172033 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.172325 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.174124 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.174590 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.175599 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.175754 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.176492 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.183109 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.183743 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.184305 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.184691 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.185217 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.185460 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.185671 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.189072 4893 scope.go:117] "RemoveContainer" containerID="0f66b8cd02e2aba4273a17d6ed4c68031985b0d3ced08f257394b4961343614e" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.190970 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.193471 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223307 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223423 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223473 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223568 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223685 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223760 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2h79\" (UniqueName: \"kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223815 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9kd\" (UniqueName: \"kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223849 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.223913 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: 
\"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.248453 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.250106 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-vvfkg"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.257543 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.262172 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-xxt8j"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.325610 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.325700 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.325742 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config\") pod 
\"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.325824 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.326968 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327039 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327436 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327586 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327629 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2h79\" (UniqueName: \"kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327663 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9kd\" (UniqueName: \"kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327687 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327721 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" 
Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.327626 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.329661 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.336235 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.344091 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.344289 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9kd\" (UniqueName: \"kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd\") pod \"route-controller-manager-56d66966bb-xqpfr\" (UID: 
\"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.347034 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2h79\" (UniqueName: \"kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79\") pod \"controller-manager-787bf4545f-9wxhj\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.383924 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4" path="/var/lib/kubelet/pods/13a98fc7-1d4e-4ddc-bf42-3c6bd46dd6b4/volumes" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.384902 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f14e9df0-fe2a-4c08-b357-db8a4472e116" path="/var/lib/kubelet/pods/f14e9df0-fe2a-4c08-b357-db8a4472e116/volumes" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.506500 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.510945 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.742442 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:03:47 crc kubenswrapper[4893]: I0314 07:03:47.999470 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:03:48 crc kubenswrapper[4893]: W0314 07:03:48.007633 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b8a6a5_7819_4638_93ca_3c87fad77aa7.slice/crio-497afbbb0b49c485ad1213b4ecf64aac4f79d4be36f4ad5b1e42180c98dde1da WatchSource:0}: Error finding container 497afbbb0b49c485ad1213b4ecf64aac4f79d4be36f4ad5b1e42180c98dde1da: Status 404 returned error can't find the container with id 497afbbb0b49c485ad1213b4ecf64aac4f79d4be36f4ad5b1e42180c98dde1da Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.168844 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" event={"ID":"46b8a6a5-7819-4638-93ca-3c87fad77aa7","Type":"ContainerStarted","Data":"497afbbb0b49c485ad1213b4ecf64aac4f79d4be36f4ad5b1e42180c98dde1da"} Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.178870 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" event={"ID":"03157997-93f8-47c6-9b18-4d56a651c9c7","Type":"ContainerStarted","Data":"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44"} Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.178908 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" 
event={"ID":"03157997-93f8-47c6-9b18-4d56a651c9c7","Type":"ContainerStarted","Data":"8dfa8e4d9085ca44ca1283840a1dc58ed7f5bb2a8e98018b17bf4a8cef14cff1"} Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.179673 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.189262 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:03:48 crc kubenswrapper[4893]: I0314 07:03:48.200076 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" podStartSLOduration=3.200056183 podStartE2EDuration="3.200056183s" podCreationTimestamp="2026-03-14 07:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:48.200038202 +0000 UTC m=+307.462215004" watchObservedRunningTime="2026-03-14 07:03:48.200056183 +0000 UTC m=+307.462232985" Mar 14 07:03:49 crc kubenswrapper[4893]: I0314 07:03:49.185901 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" event={"ID":"46b8a6a5-7819-4638-93ca-3c87fad77aa7","Type":"ContainerStarted","Data":"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d"} Mar 14 07:03:49 crc kubenswrapper[4893]: I0314 07:03:49.207025 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" podStartSLOduration=4.207005542 podStartE2EDuration="4.207005542s" podCreationTimestamp="2026-03-14 07:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
07:03:49.202728385 +0000 UTC m=+308.464905197" watchObservedRunningTime="2026-03-14 07:03:49.207005542 +0000 UTC m=+308.469182334" Mar 14 07:03:50 crc kubenswrapper[4893]: I0314 07:03:50.200418 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:03:50 crc kubenswrapper[4893]: I0314 07:03:50.205385 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.180761 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557864-q48px"] Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.182991 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.187256 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.187277 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.187920 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.188830 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-q48px"] Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.328789 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvgck\" (UniqueName: \"kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck\") pod \"auto-csr-approver-29557864-q48px\" (UID: 
\"f9d0f343-4f63-426f-89b3-35f0217cc0a2\") " pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.429950 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvgck\" (UniqueName: \"kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck\") pod \"auto-csr-approver-29557864-q48px\" (UID: \"f9d0f343-4f63-426f-89b3-35f0217cc0a2\") " pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.449561 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvgck\" (UniqueName: \"kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck\") pod \"auto-csr-approver-29557864-q48px\" (UID: \"f9d0f343-4f63-426f-89b3-35f0217cc0a2\") " pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:00 crc kubenswrapper[4893]: I0314 07:04:00.542503 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:01 crc kubenswrapper[4893]: I0314 07:04:01.001232 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-q48px"] Mar 14 07:04:01 crc kubenswrapper[4893]: I0314 07:04:01.261190 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-q48px" event={"ID":"f9d0f343-4f63-426f-89b3-35f0217cc0a2","Type":"ContainerStarted","Data":"30fd43586bbcd9f66a9760f288dae78e89a2c0aba2cdcc8ae2bb7a28d9890139"} Mar 14 07:04:03 crc kubenswrapper[4893]: I0314 07:04:03.277338 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-q48px" event={"ID":"f9d0f343-4f63-426f-89b3-35f0217cc0a2","Type":"ContainerStarted","Data":"6256f0440c97ff0fcb9199ae6921c71b190e6043f9ca3fdb6999d582514a13a4"} Mar 14 07:04:03 crc kubenswrapper[4893]: I0314 07:04:03.290960 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557864-q48px" podStartSLOduration=1.2282102209999999 podStartE2EDuration="3.290940593s" podCreationTimestamp="2026-03-14 07:04:00 +0000 UTC" firstStartedPulling="2026-03-14 07:04:01.012213939 +0000 UTC m=+320.274390771" lastFinishedPulling="2026-03-14 07:04:03.074944351 +0000 UTC m=+322.337121143" observedRunningTime="2026-03-14 07:04:03.287979051 +0000 UTC m=+322.550155843" watchObservedRunningTime="2026-03-14 07:04:03.290940593 +0000 UTC m=+322.553117385" Mar 14 07:04:04 crc kubenswrapper[4893]: I0314 07:04:04.286979 4893 generic.go:334] "Generic (PLEG): container finished" podID="f9d0f343-4f63-426f-89b3-35f0217cc0a2" containerID="6256f0440c97ff0fcb9199ae6921c71b190e6043f9ca3fdb6999d582514a13a4" exitCode=0 Mar 14 07:04:04 crc kubenswrapper[4893]: I0314 07:04:04.287081 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-q48px" 
event={"ID":"f9d0f343-4f63-426f-89b3-35f0217cc0a2","Type":"ContainerDied","Data":"6256f0440c97ff0fcb9199ae6921c71b190e6043f9ca3fdb6999d582514a13a4"} Mar 14 07:04:05 crc kubenswrapper[4893]: I0314 07:04:05.646141 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:05 crc kubenswrapper[4893]: I0314 07:04:05.806912 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvgck\" (UniqueName: \"kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck\") pod \"f9d0f343-4f63-426f-89b3-35f0217cc0a2\" (UID: \"f9d0f343-4f63-426f-89b3-35f0217cc0a2\") " Mar 14 07:04:05 crc kubenswrapper[4893]: I0314 07:04:05.816075 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck" (OuterVolumeSpecName: "kube-api-access-cvgck") pod "f9d0f343-4f63-426f-89b3-35f0217cc0a2" (UID: "f9d0f343-4f63-426f-89b3-35f0217cc0a2"). InnerVolumeSpecName "kube-api-access-cvgck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:05 crc kubenswrapper[4893]: I0314 07:04:05.909387 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvgck\" (UniqueName: \"kubernetes.io/projected/f9d0f343-4f63-426f-89b3-35f0217cc0a2-kube-api-access-cvgck\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:06 crc kubenswrapper[4893]: I0314 07:04:06.299590 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-q48px" event={"ID":"f9d0f343-4f63-426f-89b3-35f0217cc0a2","Type":"ContainerDied","Data":"30fd43586bbcd9f66a9760f288dae78e89a2c0aba2cdcc8ae2bb7a28d9890139"} Mar 14 07:04:06 crc kubenswrapper[4893]: I0314 07:04:06.299633 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30fd43586bbcd9f66a9760f288dae78e89a2c0aba2cdcc8ae2bb7a28d9890139" Mar 14 07:04:06 crc kubenswrapper[4893]: I0314 07:04:06.299718 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-q48px" Mar 14 07:04:15 crc kubenswrapper[4893]: I0314 07:04:15.686583 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:04:15 crc kubenswrapper[4893]: I0314 07:04:15.687291 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" podUID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" containerName="route-controller-manager" containerID="cri-o://8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d" gracePeriod=30 Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.269143 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.354739 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert\") pod \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.354825 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9kd\" (UniqueName: \"kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd\") pod \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.354871 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca\") pod \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.354956 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config\") pod \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\" (UID: \"46b8a6a5-7819-4638-93ca-3c87fad77aa7\") " Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.355712 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config" (OuterVolumeSpecName: "config") pod "46b8a6a5-7819-4638-93ca-3c87fad77aa7" (UID: "46b8a6a5-7819-4638-93ca-3c87fad77aa7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.355735 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca" (OuterVolumeSpecName: "client-ca") pod "46b8a6a5-7819-4638-93ca-3c87fad77aa7" (UID: "46b8a6a5-7819-4638-93ca-3c87fad77aa7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.360166 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "46b8a6a5-7819-4638-93ca-3c87fad77aa7" (UID: "46b8a6a5-7819-4638-93ca-3c87fad77aa7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.360144 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd" (OuterVolumeSpecName: "kube-api-access-9p9kd") pod "46b8a6a5-7819-4638-93ca-3c87fad77aa7" (UID: "46b8a6a5-7819-4638-93ca-3c87fad77aa7"). InnerVolumeSpecName "kube-api-access-9p9kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.367603 4893 generic.go:334] "Generic (PLEG): container finished" podID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" containerID="8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d" exitCode=0 Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.367681 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" event={"ID":"46b8a6a5-7819-4638-93ca-3c87fad77aa7","Type":"ContainerDied","Data":"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d"} Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.367726 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" event={"ID":"46b8a6a5-7819-4638-93ca-3c87fad77aa7","Type":"ContainerDied","Data":"497afbbb0b49c485ad1213b4ecf64aac4f79d4be36f4ad5b1e42180c98dde1da"} Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.367753 4893 scope.go:117] "RemoveContainer" containerID="8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.367931 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.411001 4893 scope.go:117] "RemoveContainer" containerID="8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d" Mar 14 07:04:16 crc kubenswrapper[4893]: E0314 07:04:16.411508 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d\": container with ID starting with 8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d not found: ID does not exist" containerID="8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.411538 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.411555 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d"} err="failed to get container status \"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d\": rpc error: code = NotFound desc = could not find container \"8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d\": container with ID starting with 8deb1cbfde5aa73e14e4686fde1e459f592765835a1ef9de62f4799307cc1e3d not found: ID does not exist" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.414862 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d66966bb-xqpfr"] Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.456435 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.456746 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b8a6a5-7819-4638-93ca-3c87fad77aa7-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.456835 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46b8a6a5-7819-4638-93ca-3c87fad77aa7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:16 crc kubenswrapper[4893]: I0314 07:04:16.456972 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9kd\" (UniqueName: \"kubernetes.io/projected/46b8a6a5-7819-4638-93ca-3c87fad77aa7-kube-api-access-9p9kd\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.172798 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw"] Mar 14 07:04:17 crc kubenswrapper[4893]: E0314 07:04:17.174141 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" containerName="route-controller-manager" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.174204 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" containerName="route-controller-manager" Mar 14 07:04:17 crc kubenswrapper[4893]: E0314 07:04:17.174235 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d0f343-4f63-426f-89b3-35f0217cc0a2" containerName="oc" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.174254 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d0f343-4f63-426f-89b3-35f0217cc0a2" containerName="oc" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.174487 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" 
containerName="route-controller-manager" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.174599 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d0f343-4f63-426f-89b3-35f0217cc0a2" containerName="oc" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.175289 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.177969 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.178388 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.178582 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.178580 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.179197 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.179556 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.189259 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw"] Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.269025 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82cpz\" (UniqueName: 
\"kubernetes.io/projected/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-kube-api-access-82cpz\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.269152 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-serving-cert\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.269376 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-client-ca\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.269632 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-config\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.371178 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-serving-cert\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " 
pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.371305 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-client-ca\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.371393 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-config\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.371472 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82cpz\" (UniqueName: \"kubernetes.io/projected/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-kube-api-access-82cpz\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.373898 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-client-ca\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.374888 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-config\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.391066 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-serving-cert\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.401878 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b8a6a5-7819-4638-93ca-3c87fad77aa7" path="/var/lib/kubelet/pods/46b8a6a5-7819-4638-93ca-3c87fad77aa7/volumes" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.402780 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82cpz\" (UniqueName: \"kubernetes.io/projected/e5bc2301-0803-4d84-bfbd-5b2d35da31fa-kube-api-access-82cpz\") pod \"route-controller-manager-668b86bfc6-qvsgw\" (UID: \"e5bc2301-0803-4d84-bfbd-5b2d35da31fa\") " pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.514164 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:17 crc kubenswrapper[4893]: I0314 07:04:17.882994 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw"] Mar 14 07:04:18 crc kubenswrapper[4893]: I0314 07:04:18.390594 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" event={"ID":"e5bc2301-0803-4d84-bfbd-5b2d35da31fa","Type":"ContainerStarted","Data":"2a23771dadfc220468a104587b1afa238edf19e010400c8005f19385043611b4"} Mar 14 07:04:18 crc kubenswrapper[4893]: I0314 07:04:18.390859 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:18 crc kubenswrapper[4893]: I0314 07:04:18.390873 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" event={"ID":"e5bc2301-0803-4d84-bfbd-5b2d35da31fa","Type":"ContainerStarted","Data":"18bffb798649d8bec85ca958b5f998e195499c8a799972b48c6e85b1a80925cd"} Mar 14 07:04:18 crc kubenswrapper[4893]: I0314 07:04:18.415644 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" podStartSLOduration=3.415627211 podStartE2EDuration="3.415627211s" podCreationTimestamp="2026-03-14 07:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:18.413329378 +0000 UTC m=+337.675506190" watchObservedRunningTime="2026-03-14 07:04:18.415627211 +0000 UTC m=+337.677804003" Mar 14 07:04:18 crc kubenswrapper[4893]: I0314 07:04:18.638699 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-668b86bfc6-qvsgw" Mar 14 07:04:45 crc kubenswrapper[4893]: I0314 07:04:45.824424 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:04:45 crc kubenswrapper[4893]: I0314 07:04:45.825246 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" podUID="03157997-93f8-47c6-9b18-4d56a651c9c7" containerName="controller-manager" containerID="cri-o://2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44" gracePeriod=30 Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.234660 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.372072 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca\") pod \"03157997-93f8-47c6-9b18-4d56a651c9c7\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.372149 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2h79\" (UniqueName: \"kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79\") pod \"03157997-93f8-47c6-9b18-4d56a651c9c7\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.372189 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config\") pod \"03157997-93f8-47c6-9b18-4d56a651c9c7\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.372246 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles\") pod \"03157997-93f8-47c6-9b18-4d56a651c9c7\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.372276 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert\") pod \"03157997-93f8-47c6-9b18-4d56a651c9c7\" (UID: \"03157997-93f8-47c6-9b18-4d56a651c9c7\") " Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.373014 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "03157997-93f8-47c6-9b18-4d56a651c9c7" (UID: "03157997-93f8-47c6-9b18-4d56a651c9c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.373311 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config" (OuterVolumeSpecName: "config") pod "03157997-93f8-47c6-9b18-4d56a651c9c7" (UID: "03157997-93f8-47c6-9b18-4d56a651c9c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.373429 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "03157997-93f8-47c6-9b18-4d56a651c9c7" (UID: "03157997-93f8-47c6-9b18-4d56a651c9c7"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.378538 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79" (OuterVolumeSpecName: "kube-api-access-c2h79") pod "03157997-93f8-47c6-9b18-4d56a651c9c7" (UID: "03157997-93f8-47c6-9b18-4d56a651c9c7"). InnerVolumeSpecName "kube-api-access-c2h79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.379651 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03157997-93f8-47c6-9b18-4d56a651c9c7" (UID: "03157997-93f8-47c6-9b18-4d56a651c9c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.473332 4893 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03157997-93f8-47c6-9b18-4d56a651c9c7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.473371 4893 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.473381 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2h79\" (UniqueName: \"kubernetes.io/projected/03157997-93f8-47c6-9b18-4d56a651c9c7-kube-api-access-c2h79\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.473391 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-config\") on node \"crc\" DevicePath 
\"\"" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.473399 4893 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03157997-93f8-47c6-9b18-4d56a651c9c7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.546384 4893 generic.go:334] "Generic (PLEG): container finished" podID="03157997-93f8-47c6-9b18-4d56a651c9c7" containerID="2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44" exitCode=0 Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.546433 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" event={"ID":"03157997-93f8-47c6-9b18-4d56a651c9c7","Type":"ContainerDied","Data":"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44"} Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.546468 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" event={"ID":"03157997-93f8-47c6-9b18-4d56a651c9c7","Type":"ContainerDied","Data":"8dfa8e4d9085ca44ca1283840a1dc58ed7f5bb2a8e98018b17bf4a8cef14cff1"} Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.546492 4893 scope.go:117] "RemoveContainer" containerID="2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.546463 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-787bf4545f-9wxhj" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.572273 4893 scope.go:117] "RemoveContainer" containerID="2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44" Mar 14 07:04:46 crc kubenswrapper[4893]: E0314 07:04:46.575589 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44\": container with ID starting with 2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44 not found: ID does not exist" containerID="2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.575674 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44"} err="failed to get container status \"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44\": rpc error: code = NotFound desc = could not find container \"2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44\": container with ID starting with 2166b8dd8cc71920c9c93096d5c1a662b57333c64d949b0920136dd5d0073b44 not found: ID does not exist" Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.583466 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:04:46 crc kubenswrapper[4893]: I0314 07:04:46.590286 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-787bf4545f-9wxhj"] Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.189413 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-ht999"] Mar 14 07:04:47 crc kubenswrapper[4893]: E0314 07:04:47.189683 4893 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="03157997-93f8-47c6-9b18-4d56a651c9c7" containerName="controller-manager" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.189698 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="03157997-93f8-47c6-9b18-4d56a651c9c7" containerName="controller-manager" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.189800 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="03157997-93f8-47c6-9b18-4d56a651c9c7" containerName="controller-manager" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.190160 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.193244 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.193330 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.193247 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.193720 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.194020 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.194204 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.208170 4893 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.214989 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-ht999"] Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.283182 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-client-ca\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.283252 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-config\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.283291 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-proxy-ca-bundles\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.283326 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e563b0-57e0-48a5-842a-bf0a9797c334-serving-cert\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 
07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.283427 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzrg\" (UniqueName: \"kubernetes.io/projected/e0e563b0-57e0-48a5-842a-bf0a9797c334-kube-api-access-4zzrg\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.385027 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-client-ca\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.385117 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-config\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.385173 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-proxy-ca-bundles\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.385225 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e563b0-57e0-48a5-842a-bf0a9797c334-serving-cert\") pod \"controller-manager-776ff7b55f-ht999\" (UID: 
\"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.386067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-client-ca\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.386258 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzrg\" (UniqueName: \"kubernetes.io/projected/e0e563b0-57e0-48a5-842a-bf0a9797c334-kube-api-access-4zzrg\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.387277 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-proxy-ca-bundles\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.387355 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e563b0-57e0-48a5-842a-bf0a9797c334-config\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.387499 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03157997-93f8-47c6-9b18-4d56a651c9c7" 
path="/var/lib/kubelet/pods/03157997-93f8-47c6-9b18-4d56a651c9c7/volumes" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.390800 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e563b0-57e0-48a5-842a-bf0a9797c334-serving-cert\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.405088 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzrg\" (UniqueName: \"kubernetes.io/projected/e0e563b0-57e0-48a5-842a-bf0a9797c334-kube-api-access-4zzrg\") pod \"controller-manager-776ff7b55f-ht999\" (UID: \"e0e563b0-57e0-48a5-842a-bf0a9797c334\") " pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.508192 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:47 crc kubenswrapper[4893]: I0314 07:04:47.681444 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-776ff7b55f-ht999"] Mar 14 07:04:47 crc kubenswrapper[4893]: W0314 07:04:47.691973 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e563b0_57e0_48a5_842a_bf0a9797c334.slice/crio-18bb408d0bea646b41e8d8cdd44e544d97444abb6391139d22482a78823fb033 WatchSource:0}: Error finding container 18bb408d0bea646b41e8d8cdd44e544d97444abb6391139d22482a78823fb033: Status 404 returned error can't find the container with id 18bb408d0bea646b41e8d8cdd44e544d97444abb6391139d22482a78823fb033 Mar 14 07:04:48 crc kubenswrapper[4893]: I0314 07:04:48.563429 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" event={"ID":"e0e563b0-57e0-48a5-842a-bf0a9797c334","Type":"ContainerStarted","Data":"dbcb1f99095e4ac80bf0a26f78bd47efd6e4edd0d82c7606f2149a9296419e30"} Mar 14 07:04:48 crc kubenswrapper[4893]: I0314 07:04:48.563889 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" event={"ID":"e0e563b0-57e0-48a5-842a-bf0a9797c334","Type":"ContainerStarted","Data":"18bb408d0bea646b41e8d8cdd44e544d97444abb6391139d22482a78823fb033"} Mar 14 07:04:48 crc kubenswrapper[4893]: I0314 07:04:48.563941 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:48 crc kubenswrapper[4893]: I0314 07:04:48.567343 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" Mar 14 07:04:48 crc kubenswrapper[4893]: I0314 07:04:48.585324 4893 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-776ff7b55f-ht999" podStartSLOduration=3.585302075 podStartE2EDuration="3.585302075s" podCreationTimestamp="2026-03-14 07:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:48.58513119 +0000 UTC m=+367.847308002" watchObservedRunningTime="2026-03-14 07:04:48.585302075 +0000 UTC m=+367.847478877" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.365392 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ml6qq"] Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.366734 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.384208 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ml6qq"] Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472545 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmcf\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-kube-api-access-dhmcf\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472611 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-certificates\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc 
kubenswrapper[4893]: I0314 07:04:52.472667 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6239fc19-abc8-4f3a-9f3e-5c56f901c501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472691 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-trusted-ca\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472723 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-tls\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472767 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472797 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-bound-sa-token\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.472817 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6239fc19-abc8-4f3a-9f3e-5c56f901c501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.496721 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574074 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-bound-sa-token\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574127 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6239fc19-abc8-4f3a-9f3e-5c56f901c501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 
07:04:52.574157 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmcf\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-kube-api-access-dhmcf\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574179 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-certificates\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574200 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6239fc19-abc8-4f3a-9f3e-5c56f901c501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574224 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-trusted-ca\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574247 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-tls\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.574812 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6239fc19-abc8-4f3a-9f3e-5c56f901c501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.575656 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-certificates\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.575761 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6239fc19-abc8-4f3a-9f3e-5c56f901c501-trusted-ca\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.580011 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6239fc19-abc8-4f3a-9f3e-5c56f901c501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.580039 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-registry-tls\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: 
\"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.589798 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-bound-sa-token\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.591466 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmcf\" (UniqueName: \"kubernetes.io/projected/6239fc19-abc8-4f3a-9f3e-5c56f901c501-kube-api-access-dhmcf\") pod \"image-registry-66df7c8f76-ml6qq\" (UID: \"6239fc19-abc8-4f3a-9f3e-5c56f901c501\") " pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:52 crc kubenswrapper[4893]: I0314 07:04:52.686057 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:53 crc kubenswrapper[4893]: I0314 07:04:53.066357 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ml6qq"] Mar 14 07:04:53 crc kubenswrapper[4893]: I0314 07:04:53.592401 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" event={"ID":"6239fc19-abc8-4f3a-9f3e-5c56f901c501","Type":"ContainerStarted","Data":"413952f63e91d69182fadb417fc9456f81ad6b63c1e4066f5a749f3df06e57f9"} Mar 14 07:04:53 crc kubenswrapper[4893]: I0314 07:04:53.592751 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" event={"ID":"6239fc19-abc8-4f3a-9f3e-5c56f901c501","Type":"ContainerStarted","Data":"278037015e06971b4e688ab766217294538191d80cbc13604eb65e54f49717bc"} Mar 14 07:04:53 crc kubenswrapper[4893]: I0314 07:04:53.593538 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.574876 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" podStartSLOduration=5.574854498 podStartE2EDuration="5.574854498s" podCreationTimestamp="2026-03-14 07:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:53.617215631 +0000 UTC m=+372.879392423" watchObservedRunningTime="2026-03-14 07:04:57.574854498 +0000 UTC m=+376.837031320" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.600432 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.602272 4893 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-727v2" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="registry-server" containerID="cri-o://9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c" gracePeriod=30 Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.609165 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.609420 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cmmld" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="registry-server" containerID="cri-o://a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e" gracePeriod=30 Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.617756 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.618054 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" containerID="cri-o://89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa" gracePeriod=30 Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.630047 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.630322 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gn5xf" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="registry-server" containerID="cri-o://4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" gracePeriod=30 Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.635827 4893 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kqzg"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.636627 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.657371 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.657603 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xsf27" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="registry-server" containerID="cri-o://0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573" gracePeriod=30 Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.665898 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kqzg"] Mar 14 07:04:57 crc kubenswrapper[4893]: E0314 07:04:57.683000 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a is running failed: container process not found" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" cmd=["grpc_health_probe","-addr=:50051"] Mar 14 07:04:57 crc kubenswrapper[4893]: E0314 07:04:57.683395 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a is running failed: container process not found" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" cmd=["grpc_health_probe","-addr=:50051"] Mar 14 07:04:57 crc kubenswrapper[4893]: E0314 
07:04:57.683902 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a is running failed: container process not found" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" cmd=["grpc_health_probe","-addr=:50051"] Mar 14 07:04:57 crc kubenswrapper[4893]: E0314 07:04:57.683929 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-gn5xf" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="registry-server" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.759895 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjqt\" (UniqueName: \"kubernetes.io/projected/34d6b9ea-7ada-4230-a712-fa63c21038a1-kube-api-access-zbjqt\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.759950 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.760016 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.860954 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.861022 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjqt\" (UniqueName: \"kubernetes.io/projected/34d6b9ea-7ada-4230-a712-fa63c21038a1-kube-api-access-zbjqt\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.861051 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.862304 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 
crc kubenswrapper[4893]: I0314 07:04:57.867668 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34d6b9ea-7ada-4230-a712-fa63c21038a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.877819 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjqt\" (UniqueName: \"kubernetes.io/projected/34d6b9ea-7ada-4230-a712-fa63c21038a1-kube-api-access-zbjqt\") pod \"marketplace-operator-79b997595-8kqzg\" (UID: \"34d6b9ea-7ada-4230-a712-fa63c21038a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:57 crc kubenswrapper[4893]: I0314 07:04:57.969806 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.139496 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.266292 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content\") pod \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.266337 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities\") pod \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.266417 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5crnx\" (UniqueName: \"kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx\") pod \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\" (UID: \"6eb806cc-dc34-40ff-b7d5-c33a575822ec\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.268496 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities" (OuterVolumeSpecName: "utilities") pod "6eb806cc-dc34-40ff-b7d5-c33a575822ec" (UID: "6eb806cc-dc34-40ff-b7d5-c33a575822ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.270981 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx" (OuterVolumeSpecName: "kube-api-access-5crnx") pod "6eb806cc-dc34-40ff-b7d5-c33a575822ec" (UID: "6eb806cc-dc34-40ff-b7d5-c33a575822ec"). InnerVolumeSpecName "kube-api-access-5crnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.278185 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.317439 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eb806cc-dc34-40ff-b7d5-c33a575822ec" (UID: "6eb806cc-dc34-40ff-b7d5-c33a575822ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.340273 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.343801 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.347647 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.367237 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldjfm\" (UniqueName: \"kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm\") pod \"3a28de63-7c73-4b79-9242-7dda511afc68\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.367296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content\") pod \"3a28de63-7c73-4b79-9242-7dda511afc68\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.367356 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities\") pod \"3a28de63-7c73-4b79-9242-7dda511afc68\" (UID: \"3a28de63-7c73-4b79-9242-7dda511afc68\") " Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.368117 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities" (OuterVolumeSpecName: "utilities") pod "3a28de63-7c73-4b79-9242-7dda511afc68" (UID: "3a28de63-7c73-4b79-9242-7dda511afc68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.368358 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.368386 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5crnx\" (UniqueName: \"kubernetes.io/projected/6eb806cc-dc34-40ff-b7d5-c33a575822ec-kube-api-access-5crnx\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.368396 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.368405 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb806cc-dc34-40ff-b7d5-c33a575822ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:58 crc kubenswrapper[4893]: I0314 07:04:58.369689 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm" (OuterVolumeSpecName: "kube-api-access-ldjfm") pod "3a28de63-7c73-4b79-9242-7dda511afc68" (UID: "3a28de63-7c73-4b79-9242-7dda511afc68"). InnerVolumeSpecName "kube-api-access-ldjfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469771 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics\") pod \"982402e2-823c-4c34-a446-7a2b05e9a00d\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469830 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content\") pod \"b1c55410-c44f-483c-801a-de26ae05a415\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469854 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr26g\" (UniqueName: \"kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g\") pod \"982402e2-823c-4c34-a446-7a2b05e9a00d\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469891 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca\") pod \"982402e2-823c-4c34-a446-7a2b05e9a00d\" (UID: \"982402e2-823c-4c34-a446-7a2b05e9a00d\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469938 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities\") pod \"212d2416-8201-4cae-a8b9-3121de2e8348\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469954 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content\") pod \"212d2416-8201-4cae-a8b9-3121de2e8348\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469979 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jrk\" (UniqueName: \"kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk\") pod \"b1c55410-c44f-483c-801a-de26ae05a415\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.469998 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities\") pod \"b1c55410-c44f-483c-801a-de26ae05a415\" (UID: \"b1c55410-c44f-483c-801a-de26ae05a415\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.470022 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2m5r\" (UniqueName: \"kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r\") pod \"212d2416-8201-4cae-a8b9-3121de2e8348\" (UID: \"212d2416-8201-4cae-a8b9-3121de2e8348\") " Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.470339 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldjfm\" (UniqueName: \"kubernetes.io/projected/3a28de63-7c73-4b79-9242-7dda511afc68-kube-api-access-ldjfm\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.470915 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities" (OuterVolumeSpecName: "utilities") pod "b1c55410-c44f-483c-801a-de26ae05a415" (UID: "b1c55410-c44f-483c-801a-de26ae05a415"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.471082 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "982402e2-823c-4c34-a446-7a2b05e9a00d" (UID: "982402e2-823c-4c34-a446-7a2b05e9a00d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.471095 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities" (OuterVolumeSpecName: "utilities") pod "212d2416-8201-4cae-a8b9-3121de2e8348" (UID: "212d2416-8201-4cae-a8b9-3121de2e8348"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.473342 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r" (OuterVolumeSpecName: "kube-api-access-d2m5r") pod "212d2416-8201-4cae-a8b9-3121de2e8348" (UID: "212d2416-8201-4cae-a8b9-3121de2e8348"). InnerVolumeSpecName "kube-api-access-d2m5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.478269 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "982402e2-823c-4c34-a446-7a2b05e9a00d" (UID: "982402e2-823c-4c34-a446-7a2b05e9a00d"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.478337 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g" (OuterVolumeSpecName: "kube-api-access-tr26g") pod "982402e2-823c-4c34-a446-7a2b05e9a00d" (UID: "982402e2-823c-4c34-a446-7a2b05e9a00d"). InnerVolumeSpecName "kube-api-access-tr26g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.480132 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk" (OuterVolumeSpecName: "kube-api-access-w5jrk") pod "b1c55410-c44f-483c-801a-de26ae05a415" (UID: "b1c55410-c44f-483c-801a-de26ae05a415"). InnerVolumeSpecName "kube-api-access-w5jrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.495887 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "212d2416-8201-4cae-a8b9-3121de2e8348" (UID: "212d2416-8201-4cae-a8b9-3121de2e8348"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.519347 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kqzg"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.535759 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a28de63-7c73-4b79-9242-7dda511afc68" (UID: "3a28de63-7c73-4b79-9242-7dda511afc68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.541793 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1c55410-c44f-483c-801a-de26ae05a415" (UID: "b1c55410-c44f-483c-801a-de26ae05a415"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571856 4893 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571886 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571895 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr26g\" (UniqueName: \"kubernetes.io/projected/982402e2-823c-4c34-a446-7a2b05e9a00d-kube-api-access-tr26g\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571905 4893 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/982402e2-823c-4c34-a446-7a2b05e9a00d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571914 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28de63-7c73-4b79-9242-7dda511afc68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571923 4893 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571931 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212d2416-8201-4cae-a8b9-3121de2e8348-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571939 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jrk\" (UniqueName: \"kubernetes.io/projected/b1c55410-c44f-483c-801a-de26ae05a415-kube-api-access-w5jrk\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571947 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c55410-c44f-483c-801a-de26ae05a415-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.571955 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2m5r\" (UniqueName: \"kubernetes.io/projected/212d2416-8201-4cae-a8b9-3121de2e8348-kube-api-access-d2m5r\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.627484 4893 generic.go:334] "Generic (PLEG): container finished" podID="212d2416-8201-4cae-a8b9-3121de2e8348" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" exitCode=0 Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.627562 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerDied","Data":"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.627612 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn5xf" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.627636 4893 scope.go:117] "RemoveContainer" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.627622 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn5xf" event={"ID":"212d2416-8201-4cae-a8b9-3121de2e8348","Type":"ContainerDied","Data":"e73e11a07f7d3f2820c603e841a9048a4b79bdab2f7ae5ec2fd94c3462e6a997"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.635338 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" event={"ID":"34d6b9ea-7ada-4230-a712-fa63c21038a1","Type":"ContainerStarted","Data":"d35fa64ea061b5b02ecee1324e4830a5e379780bcf53a0f3b3aa90c821eb54ba"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.637336 4893 generic.go:334] "Generic (PLEG): container finished" podID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerID="89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa" exitCode=0 Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.637443 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.637583 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerDied","Data":"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.637610 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jn8vg" event={"ID":"982402e2-823c-4c34-a446-7a2b05e9a00d","Type":"ContainerDied","Data":"1f861f6f715c76bf703ff7d48a856e92a0410c0cbe8ce41eae5e22329ccf5305"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.648430 4893 generic.go:334] "Generic (PLEG): container finished" podID="b1c55410-c44f-483c-801a-de26ae05a415" containerID="a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e" exitCode=0 Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.648483 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerDied","Data":"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.648983 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmmld" event={"ID":"b1c55410-c44f-483c-801a-de26ae05a415","Type":"ContainerDied","Data":"42f4cdccc077318852a23e6115fa0a24f777f47214a0a9abf2a3f8029dd0491c"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.648506 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmmld" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.652826 4893 scope.go:117] "RemoveContainer" containerID="c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.660890 4893 generic.go:334] "Generic (PLEG): container finished" podID="3a28de63-7c73-4b79-9242-7dda511afc68" containerID="0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573" exitCode=0 Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.660970 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerDied","Data":"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.661020 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsf27" event={"ID":"3a28de63-7c73-4b79-9242-7dda511afc68","Type":"ContainerDied","Data":"dfcbbd27839215f5bfe14727bea300df23cbcad886196e54e6bdd5783b284eee"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.661123 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xsf27" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.665014 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.670063 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn5xf"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.672970 4893 generic.go:334] "Generic (PLEG): container finished" podID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerID="9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c" exitCode=0 Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.673006 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerDied","Data":"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.673033 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-727v2" event={"ID":"6eb806cc-dc34-40ff-b7d5-c33a575822ec","Type":"ContainerDied","Data":"a6d444db79cf5a51427aefa0db554dbc6509fb62dcc44c8d03cd38ce740eecbc"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.673058 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-727v2" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.685205 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.693995 4893 scope.go:117] "RemoveContainer" containerID="0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.695611 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jn8vg"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.699382 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.704927 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cmmld"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.711597 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.723241 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xsf27"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.727597 4893 scope.go:117] "RemoveContainer" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.728157 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a\": container with ID starting with 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a not found: ID does not exist" containerID="4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a" Mar 14 07:04:59 crc kubenswrapper[4893]: 
I0314 07:04:58.728190 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a"} err="failed to get container status \"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a\": rpc error: code = NotFound desc = could not find container \"4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a\": container with ID starting with 4546f9d0ef1636dd5954132141088ef1541985f7f0def6c3e49004f0e256679a not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.728234 4893 scope.go:117] "RemoveContainer" containerID="c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.728476 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214\": container with ID starting with c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214 not found: ID does not exist" containerID="c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.728504 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214"} err="failed to get container status \"c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214\": rpc error: code = NotFound desc = could not find container \"c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214\": container with ID starting with c96dff5c36296511a0d43c9871a864cf06ebb3b69e83eb1d1a90b327b4e3d214 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.728582 4893 scope.go:117] "RemoveContainer" containerID="0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937" Mar 14 07:04:59 crc 
kubenswrapper[4893]: E0314 07:04:58.729028 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937\": container with ID starting with 0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937 not found: ID does not exist" containerID="0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.729078 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937"} err="failed to get container status \"0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937\": rpc error: code = NotFound desc = could not find container \"0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937\": container with ID starting with 0b353a2ceb899b0c40f4537fd4e5c8d75dbe5c8ae4b340db7817680d2b954937 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.729112 4893 scope.go:117] "RemoveContainer" containerID="89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.729737 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.733551 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-727v2"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.784480 4893 scope.go:117] "RemoveContainer" containerID="5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.797753 4893 scope.go:117] "RemoveContainer" containerID="89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.798807 4893 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa\": container with ID starting with 89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa not found: ID does not exist" containerID="89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.798879 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa"} err="failed to get container status \"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa\": rpc error: code = NotFound desc = could not find container \"89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa\": container with ID starting with 89f358feb51dba0355db67ac2e3a115b052bc955a3b295f35fc40fcf1fad24aa not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.798911 4893 scope.go:117] "RemoveContainer" containerID="5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.799273 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9\": container with ID starting with 5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9 not found: ID does not exist" containerID="5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.799297 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9"} err="failed to get container status \"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9\": rpc error: code = NotFound desc = could 
not find container \"5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9\": container with ID starting with 5f3284fcbc15fce7debc65c5d3d3aa0c60e2954160f012152ed144eed9cb67f9 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.799316 4893 scope.go:117] "RemoveContainer" containerID="a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.816781 4893 scope.go:117] "RemoveContainer" containerID="7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.832686 4893 scope.go:117] "RemoveContainer" containerID="20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.852468 4893 scope.go:117] "RemoveContainer" containerID="a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.852894 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e\": container with ID starting with a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e not found: ID does not exist" containerID="a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.852917 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e"} err="failed to get container status \"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e\": rpc error: code = NotFound desc = could not find container \"a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e\": container with ID starting with a87c86d88bef580462a8e9ebed7a0ba4ca8c00813eaf95769652acf77b42e80e not found: ID does not exist" Mar 14 
07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.852942 4893 scope.go:117] "RemoveContainer" containerID="7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.853266 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e\": container with ID starting with 7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e not found: ID does not exist" containerID="7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.853280 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e"} err="failed to get container status \"7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e\": rpc error: code = NotFound desc = could not find container \"7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e\": container with ID starting with 7a8758cb5b00521d19942b2c467f2f9ad10bfb18ff5ef123d694466e62bee10e not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.853291 4893 scope.go:117] "RemoveContainer" containerID="20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.853472 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b\": container with ID starting with 20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b not found: ID does not exist" containerID="20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.853486 4893 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b"} err="failed to get container status \"20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b\": rpc error: code = NotFound desc = could not find container \"20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b\": container with ID starting with 20f2f1dc2429b8e6e78d7ae2eed4ef4cfa7fc42b3b0e1bff3994778ef128453b not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.853499 4893 scope.go:117] "RemoveContainer" containerID="0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.869734 4893 scope.go:117] "RemoveContainer" containerID="fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.885448 4893 scope.go:117] "RemoveContainer" containerID="474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.898761 4893 scope.go:117] "RemoveContainer" containerID="0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.899098 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573\": container with ID starting with 0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573 not found: ID does not exist" containerID="0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899132 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573"} err="failed to get container status \"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573\": rpc error: 
code = NotFound desc = could not find container \"0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573\": container with ID starting with 0698659b76e1744d7d3996eb027486edc06d5e35daf95ff74aefeffb4d220573 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899152 4893 scope.go:117] "RemoveContainer" containerID="fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.899417 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d\": container with ID starting with fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d not found: ID does not exist" containerID="fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899433 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d"} err="failed to get container status \"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d\": rpc error: code = NotFound desc = could not find container \"fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d\": container with ID starting with fb9b75f3a9878dbe4059b9037fbce06120ae02a152cd6d13d0b1464cadce476d not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899444 4893 scope.go:117] "RemoveContainer" containerID="474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.899646 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782\": container with ID starting with 
474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782 not found: ID does not exist" containerID="474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899663 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782"} err="failed to get container status \"474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782\": rpc error: code = NotFound desc = could not find container \"474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782\": container with ID starting with 474febce8eacef3d5593565114dfaa7f8dfb770d1b2839934b5abcfa043ad782 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.899675 4893 scope.go:117] "RemoveContainer" containerID="9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.916610 4893 scope.go:117] "RemoveContainer" containerID="3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.929162 4893 scope.go:117] "RemoveContainer" containerID="97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.941467 4893 scope.go:117] "RemoveContainer" containerID="9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.941874 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c\": container with ID starting with 9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c not found: ID does not exist" containerID="9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 
07:04:58.941923 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c"} err="failed to get container status \"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c\": rpc error: code = NotFound desc = could not find container \"9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c\": container with ID starting with 9d6100f269a5eee60da7afefd8b16d60b063b020809933d71539e771c2ce3e0c not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.941971 4893 scope.go:117] "RemoveContainer" containerID="3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:58.942399 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f\": container with ID starting with 3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f not found: ID does not exist" containerID="3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.942429 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f"} err="failed to get container status \"3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f\": rpc error: code = NotFound desc = could not find container \"3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f\": container with ID starting with 3f802eda512867ab288d2cdd672194b391ec90d6fb96dc8820b75394b2a0478f not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.942456 4893 scope.go:117] "RemoveContainer" containerID="97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233" Mar 14 07:04:59 crc 
kubenswrapper[4893]: E0314 07:04:58.942730 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233\": container with ID starting with 97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233 not found: ID does not exist" containerID="97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:58.942748 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233"} err="failed to get container status \"97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233\": rpc error: code = NotFound desc = could not find container \"97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233\": container with ID starting with 97318b5c1f5183cec550a9425d7cb2e610ec362d17a9b6013829e490c5a55233 not found: ID does not exist" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.383075 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" path="/var/lib/kubelet/pods/212d2416-8201-4cae-a8b9-3121de2e8348/volumes" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.384071 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" path="/var/lib/kubelet/pods/3a28de63-7c73-4b79-9242-7dda511afc68/volumes" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.388217 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" path="/var/lib/kubelet/pods/6eb806cc-dc34-40ff-b7d5-c33a575822ec/volumes" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.388973 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" 
path="/var/lib/kubelet/pods/982402e2-823c-4c34-a446-7a2b05e9a00d/volumes" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.390028 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c55410-c44f-483c-801a-de26ae05a415" path="/var/lib/kubelet/pods/b1c55410-c44f-483c-801a-de26ae05a415/volumes" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.687131 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" event={"ID":"34d6b9ea-7ada-4230-a712-fa63c21038a1","Type":"ContainerStarted","Data":"46d67c4b148139fc7aeacc821ab2ca4a6dbf086e214ebabdc84a8f27fb1d7efc"} Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.688543 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.692700 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.702598 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8kqzg" podStartSLOduration=2.702578876 podStartE2EDuration="2.702578876s" podCreationTimestamp="2026-03-14 07:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:59.702091544 +0000 UTC m=+378.964268336" watchObservedRunningTime="2026-03-14 07:04:59.702578876 +0000 UTC m=+378.964755668" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795030 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t2cjr"] Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795261 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" 
containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795282 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795292 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795299 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795307 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795314 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795323 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795330 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795344 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795351 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795360 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c55410-c44f-483c-801a-de26ae05a415" 
containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795367 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795377 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795385 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795393 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795400 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795412 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795418 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795429 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795436 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795446 4893 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795454 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795465 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795471 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795484 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795492 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="extract-utilities" Mar 14 07:04:59 crc kubenswrapper[4893]: E0314 07:04:59.795504 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795512 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="extract-content" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795638 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="212d2416-8201-4cae-a8b9-3121de2e8348" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795654 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795664 4893 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="982402e2-823c-4c34-a446-7a2b05e9a00d" containerName="marketplace-operator" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795675 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb806cc-dc34-40ff-b7d5-c33a575822ec" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795685 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a28de63-7c73-4b79-9242-7dda511afc68" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.795693 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c55410-c44f-483c-801a-de26ae05a415" containerName="registry-server" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.796773 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.798466 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.801312 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2cjr"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.889781 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-utilities\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.889850 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-catalog-content\") pod \"redhat-marketplace-t2cjr\" (UID: 
\"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.890082 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz4n\" (UniqueName: \"kubernetes.io/projected/a2bde7b6-4ac1-40b1-b678-c64594684743-kube-api-access-hcz4n\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.989564 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.990497 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.990870 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz4n\" (UniqueName: \"kubernetes.io/projected/a2bde7b6-4ac1-40b1-b678-c64594684743-kube-api-access-hcz4n\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.990949 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-utilities\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.991037 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-catalog-content\") pod \"redhat-marketplace-t2cjr\" (UID: 
\"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.991442 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-utilities\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.991813 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2bde7b6-4ac1-40b1-b678-c64594684743-catalog-content\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:04:59 crc kubenswrapper[4893]: I0314 07:04:59.994409 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.007099 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.012427 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz4n\" (UniqueName: \"kubernetes.io/projected/a2bde7b6-4ac1-40b1-b678-c64594684743-kube-api-access-hcz4n\") pod \"redhat-marketplace-t2cjr\" (UID: \"a2bde7b6-4ac1-40b1-b678-c64594684743\") " pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.092585 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wh9m\" (UniqueName: \"kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " 
pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.092637 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.092655 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.111764 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.194547 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wh9m\" (UniqueName: \"kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.194595 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.194616 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.195015 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.195446 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.223285 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wh9m\" (UniqueName: \"kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m\") pod \"community-operators-nkk7t\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.308881 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.501208 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2cjr"] Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.677407 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.696047 4893 generic.go:334] "Generic (PLEG): container finished" podID="a2bde7b6-4ac1-40b1-b678-c64594684743" containerID="0af866d23cdfbb5f96028233dc27095d126489722674592fbb986e6b739d9113" exitCode=0 Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.696102 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2cjr" event={"ID":"a2bde7b6-4ac1-40b1-b678-c64594684743","Type":"ContainerDied","Data":"0af866d23cdfbb5f96028233dc27095d126489722674592fbb986e6b739d9113"} Mar 14 07:05:00 crc kubenswrapper[4893]: I0314 07:05:00.696354 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2cjr" event={"ID":"a2bde7b6-4ac1-40b1-b678-c64594684743","Type":"ContainerStarted","Data":"e11aaa9d6f9a814390a7f4fc040843a013d0cbf53982d786c5313068b8228256"} Mar 14 07:05:00 crc kubenswrapper[4893]: W0314 07:05:00.711734 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e3c87be_1689_4d71_b7e1_d71829082b39.slice/crio-b09b6b15446dd3e63f6241d8276c9f4b035f3b505842ade22f2c1eb59bd63e97 WatchSource:0}: Error finding container b09b6b15446dd3e63f6241d8276c9f4b035f3b505842ade22f2c1eb59bd63e97: Status 404 returned error can't find the container with id b09b6b15446dd3e63f6241d8276c9f4b035f3b505842ade22f2c1eb59bd63e97 Mar 14 07:05:01 crc kubenswrapper[4893]: I0314 07:05:01.702268 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerID="23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c" exitCode=0 Mar 14 07:05:01 crc kubenswrapper[4893]: I0314 07:05:01.702318 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerDied","Data":"23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c"} Mar 14 07:05:01 crc kubenswrapper[4893]: I0314 07:05:01.702550 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerStarted","Data":"b09b6b15446dd3e63f6241d8276c9f4b035f3b505842ade22f2c1eb59bd63e97"} Mar 14 07:05:01 crc kubenswrapper[4893]: I0314 07:05:01.705054 4893 generic.go:334] "Generic (PLEG): container finished" podID="a2bde7b6-4ac1-40b1-b678-c64594684743" containerID="f00299eb766259beb88be01498d2fe5bf9c4def1a01315cc760820c7e1914acd" exitCode=0 Mar 14 07:05:01 crc kubenswrapper[4893]: I0314 07:05:01.705211 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2cjr" event={"ID":"a2bde7b6-4ac1-40b1-b678-c64594684743","Type":"ContainerDied","Data":"f00299eb766259beb88be01498d2fe5bf9c4def1a01315cc760820c7e1914acd"} Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.194360 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xpqn5"] Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.195674 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.197228 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.202244 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpqn5"] Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.321952 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-utilities\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.321997 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96hft\" (UniqueName: \"kubernetes.io/projected/57b36d58-d179-46c0-a64b-db5bb0ccc8be-kube-api-access-96hft\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.322048 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-catalog-content\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.397172 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pcmps"] Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.399392 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.402544 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.408486 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pcmps"] Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.422936 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-utilities\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.422986 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96hft\" (UniqueName: \"kubernetes.io/projected/57b36d58-d179-46c0-a64b-db5bb0ccc8be-kube-api-access-96hft\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.423045 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-catalog-content\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.423434 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-utilities\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 
07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.423864 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b36d58-d179-46c0-a64b-db5bb0ccc8be-catalog-content\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.445785 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96hft\" (UniqueName: \"kubernetes.io/projected/57b36d58-d179-46c0-a64b-db5bb0ccc8be-kube-api-access-96hft\") pod \"redhat-operators-xpqn5\" (UID: \"57b36d58-d179-46c0-a64b-db5bb0ccc8be\") " pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.512864 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.524688 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-utilities\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.524908 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-catalog-content\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.524997 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwk4p\" (UniqueName: 
\"kubernetes.io/projected/d21f5443-3331-4559-814d-5a68ee167fa5-kube-api-access-cwk4p\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.626540 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-utilities\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.626963 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-catalog-content\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.627043 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwk4p\" (UniqueName: \"kubernetes.io/projected/d21f5443-3331-4559-814d-5a68ee167fa5-kube-api-access-cwk4p\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.627632 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-catalog-content\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.631689 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d21f5443-3331-4559-814d-5a68ee167fa5-utilities\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.646494 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwk4p\" (UniqueName: \"kubernetes.io/projected/d21f5443-3331-4559-814d-5a68ee167fa5-kube-api-access-cwk4p\") pod \"certified-operators-pcmps\" (UID: \"d21f5443-3331-4559-814d-5a68ee167fa5\") " pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.715034 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:02 crc kubenswrapper[4893]: I0314 07:05:02.908972 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpqn5"] Mar 14 07:05:02 crc kubenswrapper[4893]: W0314 07:05:02.913718 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b36d58_d179_46c0_a64b_db5bb0ccc8be.slice/crio-055094aa10f1cae5a317d5169be08ebb05e9200eda4bf2e83d44b4ec0ef6cb18 WatchSource:0}: Error finding container 055094aa10f1cae5a317d5169be08ebb05e9200eda4bf2e83d44b4ec0ef6cb18: Status 404 returned error can't find the container with id 055094aa10f1cae5a317d5169be08ebb05e9200eda4bf2e83d44b4ec0ef6cb18 Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.123745 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pcmps"] Mar 14 07:05:03 crc kubenswrapper[4893]: W0314 07:05:03.161154 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd21f5443_3331_4559_814d_5a68ee167fa5.slice/crio-d800c08fd07c525ee924322ded79d35a821bf4997c2fac644753ad100cb3dcbf 
WatchSource:0}: Error finding container d800c08fd07c525ee924322ded79d35a821bf4997c2fac644753ad100cb3dcbf: Status 404 returned error can't find the container with id d800c08fd07c525ee924322ded79d35a821bf4997c2fac644753ad100cb3dcbf Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.717114 4893 generic.go:334] "Generic (PLEG): container finished" podID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerID="763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc" exitCode=0 Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.717207 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerDied","Data":"763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.719721 4893 generic.go:334] "Generic (PLEG): container finished" podID="d21f5443-3331-4559-814d-5a68ee167fa5" containerID="39bea43306dd31b17616d6af0f125a5e6828b942fa2cc30b5d460e34ebc10581" exitCode=0 Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.719884 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcmps" event={"ID":"d21f5443-3331-4559-814d-5a68ee167fa5","Type":"ContainerDied","Data":"39bea43306dd31b17616d6af0f125a5e6828b942fa2cc30b5d460e34ebc10581"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.719919 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcmps" event={"ID":"d21f5443-3331-4559-814d-5a68ee167fa5","Type":"ContainerStarted","Data":"d800c08fd07c525ee924322ded79d35a821bf4997c2fac644753ad100cb3dcbf"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.723984 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2cjr" 
event={"ID":"a2bde7b6-4ac1-40b1-b678-c64594684743","Type":"ContainerStarted","Data":"92f2a5ec5c480518b10671c23a631c221467454d0ebe9ddae2b150d71d38a941"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.726614 4893 generic.go:334] "Generic (PLEG): container finished" podID="57b36d58-d179-46c0-a64b-db5bb0ccc8be" containerID="b2eda202fbde0de92c38fee3ffb654841d925671e08bb15d0e1c98b4a862f6c6" exitCode=0 Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.726645 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpqn5" event={"ID":"57b36d58-d179-46c0-a64b-db5bb0ccc8be","Type":"ContainerDied","Data":"b2eda202fbde0de92c38fee3ffb654841d925671e08bb15d0e1c98b4a862f6c6"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.726666 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpqn5" event={"ID":"57b36d58-d179-46c0-a64b-db5bb0ccc8be","Type":"ContainerStarted","Data":"055094aa10f1cae5a317d5169be08ebb05e9200eda4bf2e83d44b4ec0ef6cb18"} Mar 14 07:05:03 crc kubenswrapper[4893]: I0314 07:05:03.784231 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t2cjr" podStartSLOduration=2.80009689 podStartE2EDuration="4.784216086s" podCreationTimestamp="2026-03-14 07:04:59 +0000 UTC" firstStartedPulling="2026-03-14 07:05:00.698034265 +0000 UTC m=+379.960211057" lastFinishedPulling="2026-03-14 07:05:02.682153461 +0000 UTC m=+381.944330253" observedRunningTime="2026-03-14 07:05:03.783942149 +0000 UTC m=+383.046118961" watchObservedRunningTime="2026-03-14 07:05:03.784216086 +0000 UTC m=+383.046392878" Mar 14 07:05:04 crc kubenswrapper[4893]: I0314 07:05:04.733044 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpqn5" event={"ID":"57b36d58-d179-46c0-a64b-db5bb0ccc8be","Type":"ContainerStarted","Data":"720ac1c9b798689f15121906cebaee6ab0d8ad67f6e3acba6e3c501749ce815a"} 
Mar 14 07:05:04 crc kubenswrapper[4893]: I0314 07:05:04.734685 4893 generic.go:334] "Generic (PLEG): container finished" podID="d21f5443-3331-4559-814d-5a68ee167fa5" containerID="87b1ff72b49a5d48f67f4b4695a801bbd5e7cdd6b3aa61a1cf8162561efc7dfc" exitCode=0 Mar 14 07:05:04 crc kubenswrapper[4893]: I0314 07:05:04.734746 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcmps" event={"ID":"d21f5443-3331-4559-814d-5a68ee167fa5","Type":"ContainerDied","Data":"87b1ff72b49a5d48f67f4b4695a801bbd5e7cdd6b3aa61a1cf8162561efc7dfc"} Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.743548 4893 generic.go:334] "Generic (PLEG): container finished" podID="57b36d58-d179-46c0-a64b-db5bb0ccc8be" containerID="720ac1c9b798689f15121906cebaee6ab0d8ad67f6e3acba6e3c501749ce815a" exitCode=0 Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.743687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpqn5" event={"ID":"57b36d58-d179-46c0-a64b-db5bb0ccc8be","Type":"ContainerDied","Data":"720ac1c9b798689f15121906cebaee6ab0d8ad67f6e3acba6e3c501749ce815a"} Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.746936 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerStarted","Data":"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a"} Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.752127 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcmps" event={"ID":"d21f5443-3331-4559-814d-5a68ee167fa5","Type":"ContainerStarted","Data":"21569470a2703b67a833b07eba4106383e16433244c586fbf83919ce7c9334a3"} Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.790775 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nkk7t" 
podStartSLOduration=3.307525033 podStartE2EDuration="6.790756334s" podCreationTimestamp="2026-03-14 07:04:59 +0000 UTC" firstStartedPulling="2026-03-14 07:05:01.704906442 +0000 UTC m=+380.967083234" lastFinishedPulling="2026-03-14 07:05:05.188137713 +0000 UTC m=+384.450314535" observedRunningTime="2026-03-14 07:05:05.786102979 +0000 UTC m=+385.048279771" watchObservedRunningTime="2026-03-14 07:05:05.790756334 +0000 UTC m=+385.052933126" Mar 14 07:05:05 crc kubenswrapper[4893]: I0314 07:05:05.810296 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pcmps" podStartSLOduration=2.147376795 podStartE2EDuration="3.810271309s" podCreationTimestamp="2026-03-14 07:05:02 +0000 UTC" firstStartedPulling="2026-03-14 07:05:03.72264656 +0000 UTC m=+382.984823372" lastFinishedPulling="2026-03-14 07:05:05.385541094 +0000 UTC m=+384.647717886" observedRunningTime="2026-03-14 07:05:05.806178679 +0000 UTC m=+385.068355471" watchObservedRunningTime="2026-03-14 07:05:05.810271309 +0000 UTC m=+385.072448111" Mar 14 07:05:06 crc kubenswrapper[4893]: I0314 07:05:06.759424 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpqn5" event={"ID":"57b36d58-d179-46c0-a64b-db5bb0ccc8be","Type":"ContainerStarted","Data":"fd075ea2c7b979d0d0c1712776562a7dfd7877d7275defe016221b3ab86c0218"} Mar 14 07:05:06 crc kubenswrapper[4893]: I0314 07:05:06.787716 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xpqn5" podStartSLOduration=2.306488436 podStartE2EDuration="4.787697313s" podCreationTimestamp="2026-03-14 07:05:02 +0000 UTC" firstStartedPulling="2026-03-14 07:05:03.728172289 +0000 UTC m=+382.990349091" lastFinishedPulling="2026-03-14 07:05:06.209381176 +0000 UTC m=+385.471557968" observedRunningTime="2026-03-14 07:05:06.787021525 +0000 UTC m=+386.049198337" watchObservedRunningTime="2026-03-14 07:05:06.787697313 +0000 
UTC m=+386.049874115" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.111889 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.112274 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.150359 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.309123 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.309186 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.342892 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.833800 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:05:10 crc kubenswrapper[4893]: I0314 07:05:10.840673 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t2cjr" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.513303 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.513761 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.696107 4893 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ml6qq" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.715744 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.715781 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.772811 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.787752 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:12 crc kubenswrapper[4893]: I0314 07:05:12.845929 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pcmps" Mar 14 07:05:13 crc kubenswrapper[4893]: I0314 07:05:13.576116 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xpqn5" podUID="57b36d58-d179-46c0-a64b-db5bb0ccc8be" containerName="registry-server" probeResult="failure" output=< Mar 14 07:05:13 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:05:13 crc kubenswrapper[4893]: > Mar 14 07:05:22 crc kubenswrapper[4893]: I0314 07:05:22.561700 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:22 crc kubenswrapper[4893]: I0314 07:05:22.599873 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xpqn5" Mar 14 07:05:37 crc kubenswrapper[4893]: I0314 07:05:37.824090 4893 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" podUID="85a8345c-8774-4272-887a-42b2d64a65cf" containerName="registry" containerID="cri-o://4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8" gracePeriod=30 Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.274085 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.445862 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.445949 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446006 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446053 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446079 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446098 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446121 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcg52\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.446157 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls\") pod \"85a8345c-8774-4272-887a-42b2d64a65cf\" (UID: \"85a8345c-8774-4272-887a-42b2d64a65cf\") " Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.447096 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.447121 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.452768 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.453349 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52" (OuterVolumeSpecName: "kube-api-access-gcg52") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "kube-api-access-gcg52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.453619 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.454038 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.459138 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.463929 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "85a8345c-8774-4272-887a-42b2d64a65cf" (UID: "85a8345c-8774-4272-887a-42b2d64a65cf"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547694 4893 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547749 4893 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a8345c-8774-4272-887a-42b2d64a65cf-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547771 4893 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547790 4893 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85a8345c-8774-4272-887a-42b2d64a65cf-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547810 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcg52\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-kube-api-access-gcg52\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547831 4893 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a8345c-8774-4272-887a-42b2d64a65cf-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.547849 4893 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a8345c-8774-4272-887a-42b2d64a65cf-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 07:05:38 crc 
kubenswrapper[4893]: I0314 07:05:38.957977 4893 generic.go:334] "Generic (PLEG): container finished" podID="85a8345c-8774-4272-887a-42b2d64a65cf" containerID="4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8" exitCode=0 Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.958076 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.958104 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" event={"ID":"85a8345c-8774-4272-887a-42b2d64a65cf","Type":"ContainerDied","Data":"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8"} Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.958386 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjzxr" event={"ID":"85a8345c-8774-4272-887a-42b2d64a65cf","Type":"ContainerDied","Data":"47937fbea22995d6de1a09e7355cd647ed372fae6e9897681b4eb0c16893141d"} Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.958410 4893 scope.go:117] "RemoveContainer" containerID="4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.987056 4893 scope.go:117] "RemoveContainer" containerID="4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8" Mar 14 07:05:38 crc kubenswrapper[4893]: E0314 07:05:38.987842 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8\": container with ID starting with 4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8 not found: ID does not exist" containerID="4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.987898 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8"} err="failed to get container status \"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8\": rpc error: code = NotFound desc = could not find container \"4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8\": container with ID starting with 4ef354b203d1da84471c23e5fea5474eab1dcc41b9caa9c5bae3440f86862da8 not found: ID does not exist" Mar 14 07:05:38 crc kubenswrapper[4893]: I0314 07:05:38.998207 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:05:39 crc kubenswrapper[4893]: I0314 07:05:39.003059 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjzxr"] Mar 14 07:05:39 crc kubenswrapper[4893]: I0314 07:05:39.391316 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a8345c-8774-4272-887a-42b2d64a65cf" path="/var/lib/kubelet/pods/85a8345c-8774-4272-887a-42b2d64a65cf/volumes" Mar 14 07:05:59 crc kubenswrapper[4893]: I0314 07:05:59.731114 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:05:59 crc kubenswrapper[4893]: I0314 07:05:59.731716 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.149667 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29557866-bzhl6"] Mar 14 07:06:00 crc kubenswrapper[4893]: E0314 07:06:00.150538 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a8345c-8774-4272-887a-42b2d64a65cf" containerName="registry" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.150564 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a8345c-8774-4272-887a-42b2d64a65cf" containerName="registry" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.150829 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a8345c-8774-4272-887a-42b2d64a65cf" containerName="registry" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.151447 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.154486 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.159174 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.160285 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-bzhl6"] Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.160677 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.253590 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhdc\" (UniqueName: \"kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc\") pod \"auto-csr-approver-29557866-bzhl6\" (UID: \"2f6cbd57-37ba-433b-afbf-59e3083c9dc2\") " pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:00 crc 
kubenswrapper[4893]: I0314 07:06:00.354973 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhdc\" (UniqueName: \"kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc\") pod \"auto-csr-approver-29557866-bzhl6\" (UID: \"2f6cbd57-37ba-433b-afbf-59e3083c9dc2\") " pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.374253 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhdc\" (UniqueName: \"kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc\") pod \"auto-csr-approver-29557866-bzhl6\" (UID: \"2f6cbd57-37ba-433b-afbf-59e3083c9dc2\") " pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.483358 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:00 crc kubenswrapper[4893]: I0314 07:06:00.739366 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-bzhl6"] Mar 14 07:06:00 crc kubenswrapper[4893]: W0314 07:06:00.752712 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f6cbd57_37ba_433b_afbf_59e3083c9dc2.slice/crio-79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8 WatchSource:0}: Error finding container 79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8: Status 404 returned error can't find the container with id 79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8 Mar 14 07:06:01 crc kubenswrapper[4893]: I0314 07:06:01.087456 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" 
event={"ID":"2f6cbd57-37ba-433b-afbf-59e3083c9dc2","Type":"ContainerStarted","Data":"79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8"} Mar 14 07:06:02 crc kubenswrapper[4893]: I0314 07:06:02.095329 4893 generic.go:334] "Generic (PLEG): container finished" podID="2f6cbd57-37ba-433b-afbf-59e3083c9dc2" containerID="5dfcca0dfe39331a0a5a26fcbed16b3c40f7435821115aec81763e25730b6b97" exitCode=0 Mar 14 07:06:02 crc kubenswrapper[4893]: I0314 07:06:02.095760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" event={"ID":"2f6cbd57-37ba-433b-afbf-59e3083c9dc2","Type":"ContainerDied","Data":"5dfcca0dfe39331a0a5a26fcbed16b3c40f7435821115aec81763e25730b6b97"} Mar 14 07:06:03 crc kubenswrapper[4893]: I0314 07:06:03.436370 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:03 crc kubenswrapper[4893]: I0314 07:06:03.623905 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fhdc\" (UniqueName: \"kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc\") pod \"2f6cbd57-37ba-433b-afbf-59e3083c9dc2\" (UID: \"2f6cbd57-37ba-433b-afbf-59e3083c9dc2\") " Mar 14 07:06:03 crc kubenswrapper[4893]: I0314 07:06:03.628864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc" (OuterVolumeSpecName: "kube-api-access-4fhdc") pod "2f6cbd57-37ba-433b-afbf-59e3083c9dc2" (UID: "2f6cbd57-37ba-433b-afbf-59e3083c9dc2"). InnerVolumeSpecName "kube-api-access-4fhdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:03 crc kubenswrapper[4893]: I0314 07:06:03.725230 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fhdc\" (UniqueName: \"kubernetes.io/projected/2f6cbd57-37ba-433b-afbf-59e3083c9dc2-kube-api-access-4fhdc\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:04 crc kubenswrapper[4893]: I0314 07:06:04.108366 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" event={"ID":"2f6cbd57-37ba-433b-afbf-59e3083c9dc2","Type":"ContainerDied","Data":"79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8"} Mar 14 07:06:04 crc kubenswrapper[4893]: I0314 07:06:04.108409 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79824829e3f5fba7649b14cdc74c97edf0faa966041f9c170267f3c203f36aa8" Mar 14 07:06:04 crc kubenswrapper[4893]: I0314 07:06:04.108410 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-bzhl6" Mar 14 07:06:29 crc kubenswrapper[4893]: I0314 07:06:29.731690 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:06:29 crc kubenswrapper[4893]: I0314 07:06:29.732328 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:06:59 crc kubenswrapper[4893]: I0314 07:06:59.730904 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:06:59 crc kubenswrapper[4893]: I0314 07:06:59.731928 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:06:59 crc kubenswrapper[4893]: I0314 07:06:59.732008 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:06:59 crc kubenswrapper[4893]: I0314 07:06:59.743988 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:06:59 crc kubenswrapper[4893]: I0314 07:06:59.744119 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d" gracePeriod=600 Mar 14 07:07:00 crc kubenswrapper[4893]: I0314 07:07:00.523625 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d" exitCode=0 Mar 14 07:07:00 crc kubenswrapper[4893]: I0314 07:07:00.523834 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d"} Mar 14 07:07:00 crc kubenswrapper[4893]: I0314 07:07:00.524355 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792"} Mar 14 07:07:00 crc kubenswrapper[4893]: I0314 07:07:00.524389 4893 scope.go:117] "RemoveContainer" containerID="0b93320f866f07b1494ab844854d58a4a60af1526c128c8f2df7794c38234a32" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.136846 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557868-xdsfw"] Mar 14 07:08:00 crc kubenswrapper[4893]: E0314 07:08:00.137862 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6cbd57-37ba-433b-afbf-59e3083c9dc2" containerName="oc" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.137878 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6cbd57-37ba-433b-afbf-59e3083c9dc2" containerName="oc" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.137985 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6cbd57-37ba-433b-afbf-59e3083c9dc2" containerName="oc" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.138505 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.141315 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.141749 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.141928 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.143754 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-xdsfw"] Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.256912 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzl6p\" (UniqueName: \"kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p\") pod \"auto-csr-approver-29557868-xdsfw\" (UID: \"8a845fb3-b401-45ba-8c57-5d2b5a7e4320\") " pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.358437 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzl6p\" (UniqueName: \"kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p\") pod \"auto-csr-approver-29557868-xdsfw\" (UID: \"8a845fb3-b401-45ba-8c57-5d2b5a7e4320\") " pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.390796 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzl6p\" (UniqueName: \"kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p\") pod \"auto-csr-approver-29557868-xdsfw\" (UID: \"8a845fb3-b401-45ba-8c57-5d2b5a7e4320\") " 
pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.458363 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.817287 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-xdsfw"] Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.826482 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:08:00 crc kubenswrapper[4893]: I0314 07:08:00.901204 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" event={"ID":"8a845fb3-b401-45ba-8c57-5d2b5a7e4320","Type":"ContainerStarted","Data":"a0c17e5169c87aba81364f8d524833ad25782a11ba7ee6c46b77ef9cea1243ee"} Mar 14 07:08:02 crc kubenswrapper[4893]: I0314 07:08:02.911675 4893 generic.go:334] "Generic (PLEG): container finished" podID="8a845fb3-b401-45ba-8c57-5d2b5a7e4320" containerID="ce002a6786167410da0b822e7b73868b4f7987f4ff00bb4e971102daae2a214a" exitCode=0 Mar 14 07:08:02 crc kubenswrapper[4893]: I0314 07:08:02.911753 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" event={"ID":"8a845fb3-b401-45ba-8c57-5d2b5a7e4320","Type":"ContainerDied","Data":"ce002a6786167410da0b822e7b73868b4f7987f4ff00bb4e971102daae2a214a"} Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.173154 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.310418 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzl6p\" (UniqueName: \"kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p\") pod \"8a845fb3-b401-45ba-8c57-5d2b5a7e4320\" (UID: \"8a845fb3-b401-45ba-8c57-5d2b5a7e4320\") " Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.315994 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p" (OuterVolumeSpecName: "kube-api-access-bzl6p") pod "8a845fb3-b401-45ba-8c57-5d2b5a7e4320" (UID: "8a845fb3-b401-45ba-8c57-5d2b5a7e4320"). InnerVolumeSpecName "kube-api-access-bzl6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.412143 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzl6p\" (UniqueName: \"kubernetes.io/projected/8a845fb3-b401-45ba-8c57-5d2b5a7e4320-kube-api-access-bzl6p\") on node \"crc\" DevicePath \"\"" Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.927499 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" event={"ID":"8a845fb3-b401-45ba-8c57-5d2b5a7e4320","Type":"ContainerDied","Data":"a0c17e5169c87aba81364f8d524833ad25782a11ba7ee6c46b77ef9cea1243ee"} Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.927559 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c17e5169c87aba81364f8d524833ad25782a11ba7ee6c46b77ef9cea1243ee" Mar 14 07:08:04 crc kubenswrapper[4893]: I0314 07:08:04.927578 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-xdsfw" Mar 14 07:08:05 crc kubenswrapper[4893]: I0314 07:08:05.231101 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-tmqkz"] Mar 14 07:08:05 crc kubenswrapper[4893]: I0314 07:08:05.237195 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-tmqkz"] Mar 14 07:08:05 crc kubenswrapper[4893]: I0314 07:08:05.385017 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c5a27c-6f7c-46d9-bc16-27796b0bd030" path="/var/lib/kubelet/pods/19c5a27c-6f7c-46d9-bc16-27796b0bd030/volumes" Mar 14 07:08:43 crc kubenswrapper[4893]: I0314 07:08:43.966839 4893 scope.go:117] "RemoveContainer" containerID="85541424c8554b4ec125fcbc99002ef957e72ab249cce4e5d31dfa491362836e" Mar 14 07:08:59 crc kubenswrapper[4893]: I0314 07:08:59.731441 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:08:59 crc kubenswrapper[4893]: I0314 07:08:59.732229 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:09:29 crc kubenswrapper[4893]: I0314 07:09:29.731346 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:09:29 crc kubenswrapper[4893]: 
I0314 07:09:29.731984 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.731486 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.732324 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.732426 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.733664 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.733790 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" containerID="cri-o://5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792" gracePeriod=600 Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.923480 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792" exitCode=0 Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.923556 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792"} Mar 14 07:09:59 crc kubenswrapper[4893]: I0314 07:09:59.923922 4893 scope.go:117] "RemoveContainer" containerID="468464e3d7d61a6b60ffc50852de07cc978583465cf598b48439ac3af53b576d" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.141594 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557870-gxmgb"] Mar 14 07:10:00 crc kubenswrapper[4893]: E0314 07:10:00.141896 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a845fb3-b401-45ba-8c57-5d2b5a7e4320" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.141913 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a845fb3-b401-45ba-8c57-5d2b5a7e4320" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.142023 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a845fb3-b401-45ba-8c57-5d2b5a7e4320" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.142350 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-gxmgb"] Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.142442 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.145201 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.145267 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.145492 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.313040 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6fp\" (UniqueName: \"kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp\") pod \"auto-csr-approver-29557870-gxmgb\" (UID: \"7c3d9444-32d8-44e1-93cd-5ab047703eec\") " pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.414063 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6fp\" (UniqueName: \"kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp\") pod \"auto-csr-approver-29557870-gxmgb\" (UID: \"7c3d9444-32d8-44e1-93cd-5ab047703eec\") " pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.433077 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6fp\" (UniqueName: \"kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp\") pod \"auto-csr-approver-29557870-gxmgb\" (UID: \"7c3d9444-32d8-44e1-93cd-5ab047703eec\") " pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.464057 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.614219 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-gxmgb"] Mar 14 07:10:00 crc kubenswrapper[4893]: W0314 07:10:00.619357 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3d9444_32d8_44e1_93cd_5ab047703eec.slice/crio-e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67 WatchSource:0}: Error finding container e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67: Status 404 returned error can't find the container with id e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67 Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.942232 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35"} Mar 14 07:10:00 crc kubenswrapper[4893]: I0314 07:10:00.944246 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" event={"ID":"7c3d9444-32d8-44e1-93cd-5ab047703eec","Type":"ContainerStarted","Data":"e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67"} Mar 14 07:10:01 crc kubenswrapper[4893]: I0314 07:10:01.951673 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" event={"ID":"7c3d9444-32d8-44e1-93cd-5ab047703eec","Type":"ContainerStarted","Data":"d9a956f8c5ca6d1103ec66377fa782a47cd493892c94e61a329217220fa197fb"} Mar 14 07:10:01 crc kubenswrapper[4893]: I0314 07:10:01.966093 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" 
podStartSLOduration=0.972816319 podStartE2EDuration="1.966074349s" podCreationTimestamp="2026-03-14 07:10:00 +0000 UTC" firstStartedPulling="2026-03-14 07:10:00.622148194 +0000 UTC m=+679.884324986" lastFinishedPulling="2026-03-14 07:10:01.615406214 +0000 UTC m=+680.877583016" observedRunningTime="2026-03-14 07:10:01.962405847 +0000 UTC m=+681.224582649" watchObservedRunningTime="2026-03-14 07:10:01.966074349 +0000 UTC m=+681.228251141" Mar 14 07:10:02 crc kubenswrapper[4893]: I0314 07:10:02.964102 4893 generic.go:334] "Generic (PLEG): container finished" podID="7c3d9444-32d8-44e1-93cd-5ab047703eec" containerID="d9a956f8c5ca6d1103ec66377fa782a47cd493892c94e61a329217220fa197fb" exitCode=0 Mar 14 07:10:02 crc kubenswrapper[4893]: I0314 07:10:02.964169 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" event={"ID":"7c3d9444-32d8-44e1-93cd-5ab047703eec","Type":"ContainerDied","Data":"d9a956f8c5ca6d1103ec66377fa782a47cd493892c94e61a329217220fa197fb"} Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.184232 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.362283 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6fp\" (UniqueName: \"kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp\") pod \"7c3d9444-32d8-44e1-93cd-5ab047703eec\" (UID: \"7c3d9444-32d8-44e1-93cd-5ab047703eec\") " Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.370236 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp" (OuterVolumeSpecName: "kube-api-access-qt6fp") pod "7c3d9444-32d8-44e1-93cd-5ab047703eec" (UID: "7c3d9444-32d8-44e1-93cd-5ab047703eec"). InnerVolumeSpecName "kube-api-access-qt6fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.454087 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-q48px"] Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.457851 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-q48px"] Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.463587 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6fp\" (UniqueName: \"kubernetes.io/projected/7c3d9444-32d8-44e1-93cd-5ab047703eec-kube-api-access-qt6fp\") on node \"crc\" DevicePath \"\"" Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.979767 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" event={"ID":"7c3d9444-32d8-44e1-93cd-5ab047703eec","Type":"ContainerDied","Data":"e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67"} Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.979807 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e6c00f6a7722f2740f6335424a0d349dfabd54c29c9b590c7299dea6f94e67" Mar 14 07:10:04 crc kubenswrapper[4893]: I0314 07:10:04.979911 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-gxmgb" Mar 14 07:10:05 crc kubenswrapper[4893]: I0314 07:10:05.385909 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d0f343-4f63-426f-89b3-35f0217cc0a2" path="/var/lib/kubelet/pods/f9d0f343-4f63-426f-89b3-35f0217cc0a2/volumes" Mar 14 07:10:44 crc kubenswrapper[4893]: I0314 07:10:44.052156 4893 scope.go:117] "RemoveContainer" containerID="6256f0440c97ff0fcb9199ae6921c71b190e6043f9ca3fdb6999d582514a13a4" Mar 14 07:11:12 crc kubenswrapper[4893]: I0314 07:11:12.590025 4893 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 07:11:59 crc kubenswrapper[4893]: I0314 07:11:59.731602 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:11:59 crc kubenswrapper[4893]: I0314 07:11:59.732661 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.132431 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557872-s4jrn"] Mar 14 07:12:00 crc kubenswrapper[4893]: E0314 07:12:00.132707 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3d9444-32d8-44e1-93cd-5ab047703eec" containerName="oc" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.132722 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3d9444-32d8-44e1-93cd-5ab047703eec" containerName="oc" Mar 14 
07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.132815 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3d9444-32d8-44e1-93cd-5ab047703eec" containerName="oc" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.134593 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.139813 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.140112 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.140412 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.144870 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-s4jrn"] Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.228727 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5h75\" (UniqueName: \"kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75\") pod \"auto-csr-approver-29557872-s4jrn\" (UID: \"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e\") " pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.330788 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5h75\" (UniqueName: \"kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75\") pod \"auto-csr-approver-29557872-s4jrn\" (UID: \"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e\") " pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.374586 
4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5h75\" (UniqueName: \"kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75\") pod \"auto-csr-approver-29557872-s4jrn\" (UID: \"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e\") " pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.453668 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.645136 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-s4jrn"] Mar 14 07:12:00 crc kubenswrapper[4893]: I0314 07:12:00.689474 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" event={"ID":"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e","Type":"ContainerStarted","Data":"58dc119adefee70427c6b2bae979f6aacbf5948147533c349161239e25141355"} Mar 14 07:12:02 crc kubenswrapper[4893]: I0314 07:12:02.700873 4893 generic.go:334] "Generic (PLEG): container finished" podID="f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" containerID="7f869d2d08b072906140050fdd8e8329d0ca605ecc94294ecc9799c0092f363d" exitCode=0 Mar 14 07:12:02 crc kubenswrapper[4893]: I0314 07:12:02.700939 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" event={"ID":"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e","Type":"ContainerDied","Data":"7f869d2d08b072906140050fdd8e8329d0ca605ecc94294ecc9799c0092f363d"} Mar 14 07:12:03 crc kubenswrapper[4893]: I0314 07:12:03.987884 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.082231 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5h75\" (UniqueName: \"kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75\") pod \"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e\" (UID: \"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e\") " Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.091419 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75" (OuterVolumeSpecName: "kube-api-access-n5h75") pod "f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" (UID: "f0ed59f6-a6b5-4601-97d6-66f298b0fc3e"). InnerVolumeSpecName "kube-api-access-n5h75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.183752 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5h75\" (UniqueName: \"kubernetes.io/projected/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e-kube-api-access-n5h75\") on node \"crc\" DevicePath \"\"" Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.715922 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" event={"ID":"f0ed59f6-a6b5-4601-97d6-66f298b0fc3e","Type":"ContainerDied","Data":"58dc119adefee70427c6b2bae979f6aacbf5948147533c349161239e25141355"} Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.715989 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58dc119adefee70427c6b2bae979f6aacbf5948147533c349161239e25141355" Mar 14 07:12:04 crc kubenswrapper[4893]: I0314 07:12:04.716007 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-s4jrn" Mar 14 07:12:04 crc kubenswrapper[4893]: E0314 07:12:04.776166 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ed59f6_a6b5_4601_97d6_66f298b0fc3e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ed59f6_a6b5_4601_97d6_66f298b0fc3e.slice/crio-58dc119adefee70427c6b2bae979f6aacbf5948147533c349161239e25141355\": RecentStats: unable to find data in memory cache]" Mar 14 07:12:05 crc kubenswrapper[4893]: I0314 07:12:05.058744 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-bzhl6"] Mar 14 07:12:05 crc kubenswrapper[4893]: I0314 07:12:05.063126 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-bzhl6"] Mar 14 07:12:05 crc kubenswrapper[4893]: I0314 07:12:05.388166 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f6cbd57-37ba-433b-afbf-59e3083c9dc2" path="/var/lib/kubelet/pods/2f6cbd57-37ba-433b-afbf-59e3083c9dc2/volumes" Mar 14 07:12:29 crc kubenswrapper[4893]: I0314 07:12:29.730807 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:12:29 crc kubenswrapper[4893]: I0314 07:12:29.731248 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Mar 14 07:12:44 crc kubenswrapper[4893]: I0314 07:12:44.140787 4893 scope.go:117] "RemoveContainer" containerID="5dfcca0dfe39331a0a5a26fcbed16b3c40f7435821115aec81763e25730b6b97" Mar 14 07:12:59 crc kubenswrapper[4893]: I0314 07:12:59.730660 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:12:59 crc kubenswrapper[4893]: I0314 07:12:59.732862 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:12:59 crc kubenswrapper[4893]: I0314 07:12:59.732991 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:12:59 crc kubenswrapper[4893]: I0314 07:12:59.733639 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:12:59 crc kubenswrapper[4893]: I0314 07:12:59.733799 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35" gracePeriod=600 Mar 14 07:13:00 crc 
kubenswrapper[4893]: I0314 07:13:00.039861 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35" exitCode=0 Mar 14 07:13:00 crc kubenswrapper[4893]: I0314 07:13:00.039934 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35"} Mar 14 07:13:00 crc kubenswrapper[4893]: I0314 07:13:00.040309 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0"} Mar 14 07:13:00 crc kubenswrapper[4893]: I0314 07:13:00.040334 4893 scope.go:117] "RemoveContainer" containerID="5767eeb7a124a0dccdff1fd12ceffa1bf004fba841b3e93b8c0806a60c555792" Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.777916 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cbskd"] Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.778866 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-controller" containerID="cri-o://d26c4e4548c14b8e7000a08e95791dbba2dbf19d141a9339a1189eef4a6671d4" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779210 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-node" containerID="cri-o://03a4f2bcd347b9f116d0bb722090bc3d64c5bbada85d7a2085b0df772b554219" gracePeriod=30 
Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779243 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-acl-logging" containerID="cri-o://6f567feeedfca9666b2b3013232517cb979a7009fe9e67e796e5cfa0e8c747be" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779293 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="sbdb" containerID="cri-o://99ccd9e40fe6a70d8fd34f384f62c4df059e937f65a039b068126039882d4784" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779282 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ddc3fe94134a2ec81663c46aa2f6886c3f380d1c7c5eba5a895d452f8d140953" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779282 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="nbdb" containerID="cri-o://67df34ecb8f8d052c704bb968928a75eda35defdf1070c97adace5a42bf07a75" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.779359 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="northd" containerID="cri-o://f37f44adb89d717c3f8f96c0bec6073b95833268279249dd58f74cc6101590b7" gracePeriod=30 Mar 14 07:13:08 crc kubenswrapper[4893]: I0314 07:13:08.815730 4893 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovnkube-controller" containerID="cri-o://7a807995d9c790c0f2ca6db51e41256c6db5022f6edfa08930fed8f85a99319e" gracePeriod=30 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.099730 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-acl-logging/0.log" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.100462 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-controller/0.log" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.100950 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="7a807995d9c790c0f2ca6db51e41256c6db5022f6edfa08930fed8f85a99319e" exitCode=0 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.100989 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="99ccd9e40fe6a70d8fd34f384f62c4df059e937f65a039b068126039882d4784" exitCode=0 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101014 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="67df34ecb8f8d052c704bb968928a75eda35defdf1070c97adace5a42bf07a75" exitCode=0 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101033 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="f37f44adb89d717c3f8f96c0bec6073b95833268279249dd58f74cc6101590b7" exitCode=0 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101047 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="ddc3fe94134a2ec81663c46aa2f6886c3f380d1c7c5eba5a895d452f8d140953" exitCode=0 Mar 14 07:13:09 crc 
kubenswrapper[4893]: I0314 07:13:09.101074 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="03a4f2bcd347b9f116d0bb722090bc3d64c5bbada85d7a2085b0df772b554219" exitCode=0 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101091 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="6f567feeedfca9666b2b3013232517cb979a7009fe9e67e796e5cfa0e8c747be" exitCode=143 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101105 4893 generic.go:334] "Generic (PLEG): container finished" podID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerID="d26c4e4548c14b8e7000a08e95791dbba2dbf19d141a9339a1189eef4a6671d4" exitCode=143 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101038 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"7a807995d9c790c0f2ca6db51e41256c6db5022f6edfa08930fed8f85a99319e"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101164 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"99ccd9e40fe6a70d8fd34f384f62c4df059e937f65a039b068126039882d4784"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101176 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"67df34ecb8f8d052c704bb968928a75eda35defdf1070c97adace5a42bf07a75"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101185 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" 
event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"f37f44adb89d717c3f8f96c0bec6073b95833268279249dd58f74cc6101590b7"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101195 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"ddc3fe94134a2ec81663c46aa2f6886c3f380d1c7c5eba5a895d452f8d140953"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101204 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"03a4f2bcd347b9f116d0bb722090bc3d64c5bbada85d7a2085b0df772b554219"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101216 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"6f567feeedfca9666b2b3013232517cb979a7009fe9e67e796e5cfa0e8c747be"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.101226 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"d26c4e4548c14b8e7000a08e95791dbba2dbf19d141a9339a1189eef4a6671d4"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.103051 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hk75c_9d0cffc0-c15f-4461-817c-1a937ad2afba/kube-multus/0.log" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.103104 4893 generic.go:334] "Generic (PLEG): container finished" podID="9d0cffc0-c15f-4461-817c-1a937ad2afba" containerID="cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa" exitCode=2 Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.103143 4893 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-hk75c" event={"ID":"9d0cffc0-c15f-4461-817c-1a937ad2afba","Type":"ContainerDied","Data":"cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa"} Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.103695 4893 scope.go:117] "RemoveContainer" containerID="cc713a16aeafba57ddb4b72c228d42d3c34a69d0bcfca89de5ddf76c94a6b0fa" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.447820 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-acl-logging/0.log" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.448575 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-controller/0.log" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.448960 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509313 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bszj2"] Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509626 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509649 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509668 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="nbdb" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509676 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="nbdb" Mar 14 
07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509690 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="sbdb" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509697 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="sbdb" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509706 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovnkube-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509714 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovnkube-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509726 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kubecfg-setup" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509735 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kubecfg-setup" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509753 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-node" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509763 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-node" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509779 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="northd" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509787 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="northd" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 
07:13:09.509799 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509808 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509817 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" containerName="oc" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509825 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" containerName="oc" Mar 14 07:13:09 crc kubenswrapper[4893]: E0314 07:13:09.509835 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-acl-logging" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509843 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-acl-logging" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509961 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="northd" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509980 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509989 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovnkube-controller" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.509999 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-node" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.510011 4893 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="nbdb" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.510021 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" containerName="oc" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.510029 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="sbdb" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.510041 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="ovn-acl-logging" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.510062 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.512305 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.533909 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.533958 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534026 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534026 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534055 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534077 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log" (OuterVolumeSpecName: "node-log") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534087 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534120 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534148 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534193 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534228 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534255 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534279 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534311 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534365 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd7rs\" (UniqueName: \"kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs\") pod 
\"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534393 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534432 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534477 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534504 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534567 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534617 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534619 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534645 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn\") pod \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\" (UID: \"fa7a2543-8a0d-4d25-9e7a-bc387f9662df\") " Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534659 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534686 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534828 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-kubelet\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.534904 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-ovn\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535029 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535504 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535561 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535598 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-script-lib\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535632 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535656 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-var-lib-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535689 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-env-overrides\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535723 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-systemd-units\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535741 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-etc-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535766 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-systemd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535800 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-netd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535830 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-node-log\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535850 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-slash\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535875 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-log-socket\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535897 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-config\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535922 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-bin\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535959 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcht7\" (UniqueName: \"kubernetes.io/projected/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-kube-api-access-mcht7\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.535979 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-netns\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536002 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovn-node-metrics-cert\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536041 4893 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-slash\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536056 4893 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-node-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: 
I0314 07:13:09.536068 4893 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536083 4893 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536095 4893 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536106 4893 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536221 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536247 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536270 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536292 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536614 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536923 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536960 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.536984 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.537011 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.537058 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.544807 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.554741 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.554782 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs" (OuterVolumeSpecName: "kube-api-access-zd7rs") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "kube-api-access-zd7rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.554827 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa7a2543-8a0d-4d25-9e7a-bc387f9662df" (UID: "fa7a2543-8a0d-4d25-9e7a-bc387f9662df"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636785 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636824 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636862 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-script-lib\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636891 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636918 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-var-lib-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: 
\"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636946 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-env-overrides\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636960 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636971 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-etc-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637015 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-var-lib-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637002 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: 
\"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636916 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.636995 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-etc-openvswitch\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637158 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-systemd-units\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637184 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-systemd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637213 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-netd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637236 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-node-log\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637255 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-slash\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637278 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-log-socket\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637276 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-systemd-units\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637298 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-config\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637322 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-netd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637328 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-systemd\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637348 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-bin\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637360 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-node-log\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637384 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-log-socket\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637420 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-slash\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637322 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-cni-bin\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637569 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcht7\" (UniqueName: \"kubernetes.io/projected/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-kube-api-access-mcht7\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637593 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-netns\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637614 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovn-node-metrics-cert\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637653 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-kubelet\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637671 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-ovn\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637755 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637766 4893 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637775 4893 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637783 4893 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637791 4893 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637800 4893 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637808 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd7rs\" (UniqueName: \"kubernetes.io/projected/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-kube-api-access-zd7rs\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637816 4893 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637826 4893 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637834 4893 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-log-socket\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637842 4893 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637851 4893 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637859 4893 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637868 4893 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa7a2543-8a0d-4d25-9e7a-bc387f9662df-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637895 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-run-ovn\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637920 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-config\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637920 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-env-overrides\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637976 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-kubelet\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.637997 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-host-run-netns\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.638398 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovnkube-script-lib\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.641478 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-ovn-node-metrics-cert\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.664171 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcht7\" (UniqueName: \"kubernetes.io/projected/0a757c8a-4aae-44f8-ace1-d48816a7f9ac-kube-api-access-mcht7\") pod \"ovnkube-node-bszj2\" (UID: \"0a757c8a-4aae-44f8-ace1-d48816a7f9ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:09 crc kubenswrapper[4893]: I0314 07:13:09.829781 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.123057 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-acl-logging/0.log" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.123979 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cbskd_fa7a2543-8a0d-4d25-9e7a-bc387f9662df/ovn-controller/0.log" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.124336 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" event={"ID":"fa7a2543-8a0d-4d25-9e7a-bc387f9662df","Type":"ContainerDied","Data":"2a560e3ee24dac91f1ed7b177a5be2c4f340aa317b22682e48b6608050ff8859"} Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.124392 4893 scope.go:117] "RemoveContainer" containerID="7a807995d9c790c0f2ca6db51e41256c6db5022f6edfa08930fed8f85a99319e" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.124736 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cbskd" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.127333 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hk75c_9d0cffc0-c15f-4461-817c-1a937ad2afba/kube-multus/0.log" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.127432 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hk75c" event={"ID":"9d0cffc0-c15f-4461-817c-1a937ad2afba","Type":"ContainerStarted","Data":"aa4eb3ae41447a5327d0eefcd6349d793d3995c104872ba9d1a28358ae396104"} Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.129927 4893 generic.go:334] "Generic (PLEG): container finished" podID="0a757c8a-4aae-44f8-ace1-d48816a7f9ac" containerID="19bd3f848b253ab0aa56501bc0b01b99575a29dca729e4647c13c70190915508" exitCode=0 Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.129966 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerDied","Data":"19bd3f848b253ab0aa56501bc0b01b99575a29dca729e4647c13c70190915508"} Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.129992 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"0984b46b5996870ec9ed74a7028070b7eb7407a4a37cd52ac29a604b2424b756"} Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.174298 4893 scope.go:117] "RemoveContainer" containerID="99ccd9e40fe6a70d8fd34f384f62c4df059e937f65a039b068126039882d4784" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.189962 4893 scope.go:117] "RemoveContainer" containerID="67df34ecb8f8d052c704bb968928a75eda35defdf1070c97adace5a42bf07a75" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.215965 4893 scope.go:117] "RemoveContainer" 
containerID="f37f44adb89d717c3f8f96c0bec6073b95833268279249dd58f74cc6101590b7" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.231211 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cbskd"] Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.234994 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cbskd"] Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.252433 4893 scope.go:117] "RemoveContainer" containerID="ddc3fe94134a2ec81663c46aa2f6886c3f380d1c7c5eba5a895d452f8d140953" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.273089 4893 scope.go:117] "RemoveContainer" containerID="03a4f2bcd347b9f116d0bb722090bc3d64c5bbada85d7a2085b0df772b554219" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.301542 4893 scope.go:117] "RemoveContainer" containerID="6f567feeedfca9666b2b3013232517cb979a7009fe9e67e796e5cfa0e8c747be" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.316475 4893 scope.go:117] "RemoveContainer" containerID="d26c4e4548c14b8e7000a08e95791dbba2dbf19d141a9339a1189eef4a6671d4" Mar 14 07:13:10 crc kubenswrapper[4893]: I0314 07:13:10.334968 4893 scope.go:117] "RemoveContainer" containerID="c88e3264725c6c08616c082527e59f27875f23b6b601b99504f2f273127cdb9f" Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.153324 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"a862505a79964616c29c7bc37f74c307f590e8ace95c412e64a49a6c398ad061"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.153760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"0a94cf8917ab24933c80e91282013ffa93f3666d51b2ad30dfafa8a4d5a2abee"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 
07:13:11.153805 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"20ddf2520f23d151accb0b3a6f56acf4fd9f0485ee22738e74ad6988697a700d"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.153823 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"5bc0973b595350429ed60ec530892d9e276656cb53b29099005140e48aeb6fdf"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.153836 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"9acdee9450b5c95f18580d7ebefdb86ada4884f1259d11ceb91eaa11fcd32051"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.153848 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"96f86eea2288880141904bd6628478f822c30561b7a84d18c408e8a423128e0c"} Mar 14 07:13:11 crc kubenswrapper[4893]: I0314 07:13:11.385133 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7a2543-8a0d-4d25-9e7a-bc387f9662df" path="/var/lib/kubelet/pods/fa7a2543-8a0d-4d25-9e7a-bc387f9662df/volumes" Mar 14 07:13:14 crc kubenswrapper[4893]: I0314 07:13:14.179833 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"0d601ece11036447f919039fe788dc160bbf5a4b2cdfa775ded7ea637764c007"} Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.786757 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-44rml"] Mar 14 07:13:15 crc kubenswrapper[4893]: 
I0314 07:13:15.788710 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.790835 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.791006 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.791203 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.791328 4893 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4b7zk" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.825254 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.825312 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6sv\" (UniqueName: \"kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.825383 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" 
Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.926650 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.926712 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6sv\" (UniqueName: \"kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.926756 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.927054 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.930453 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:15 crc kubenswrapper[4893]: I0314 07:13:15.946159 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2v6sv\" (UniqueName: \"kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv\") pod \"crc-storage-crc-44rml\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.112241 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.132738 4893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(7ac2ece025de60c5e36d9674b171137e925b46ee77878bbb542b3cf4b5e44131): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.132807 4893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(7ac2ece025de60c5e36d9674b171137e925b46ee77878bbb542b3cf4b5e44131): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.132828 4893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(7ac2ece025de60c5e36d9674b171137e925b46ee77878bbb542b3cf4b5e44131): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.132875 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-44rml_crc-storage(e28887b6-81c8-4891-b846-8c2a177c1fd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-44rml_crc-storage(e28887b6-81c8-4891-b846-8c2a177c1fd6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(7ac2ece025de60c5e36d9674b171137e925b46ee77878bbb542b3cf4b5e44131): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-44rml" podUID="e28887b6-81c8-4891-b846-8c2a177c1fd6" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.193247 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" event={"ID":"0a757c8a-4aae-44f8-ace1-d48816a7f9ac","Type":"ContainerStarted","Data":"00d90831274c8c037d844fa758de56414df2cf8bee63eec7c57501df1434be1a"} Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.193554 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.223373 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" podStartSLOduration=7.22335409 podStartE2EDuration="7.22335409s" podCreationTimestamp="2026-03-14 07:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:13:16.220419417 +0000 UTC m=+875.482596209" watchObservedRunningTime="2026-03-14 07:13:16.22335409 +0000 UTC m=+875.485530882" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.270359 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.807717 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-44rml"] Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.807849 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: I0314 07:13:16.808327 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.848988 4893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(5e843a5e984866eaf51be3f108f5c059b66ba95bccff4ac6315c3710d89b84db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.849067 4893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(5e843a5e984866eaf51be3f108f5c059b66ba95bccff4ac6315c3710d89b84db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.849103 4893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(5e843a5e984866eaf51be3f108f5c059b66ba95bccff4ac6315c3710d89b84db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:16 crc kubenswrapper[4893]: E0314 07:13:16.849172 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-44rml_crc-storage(e28887b6-81c8-4891-b846-8c2a177c1fd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-44rml_crc-storage(e28887b6-81c8-4891-b846-8c2a177c1fd6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-44rml_crc-storage_e28887b6-81c8-4891-b846-8c2a177c1fd6_0(5e843a5e984866eaf51be3f108f5c059b66ba95bccff4ac6315c3710d89b84db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-44rml" podUID="e28887b6-81c8-4891-b846-8c2a177c1fd6" Mar 14 07:13:17 crc kubenswrapper[4893]: I0314 07:13:17.202352 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:17 crc kubenswrapper[4893]: I0314 07:13:17.202958 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:17 crc kubenswrapper[4893]: I0314 07:13:17.226951 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:30 crc kubenswrapper[4893]: I0314 07:13:30.375996 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:30 crc kubenswrapper[4893]: I0314 07:13:30.377245 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:30 crc kubenswrapper[4893]: I0314 07:13:30.568763 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-44rml"] Mar 14 07:13:30 crc kubenswrapper[4893]: I0314 07:13:30.580359 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:13:31 crc kubenswrapper[4893]: I0314 07:13:31.285299 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44rml" event={"ID":"e28887b6-81c8-4891-b846-8c2a177c1fd6","Type":"ContainerStarted","Data":"1f280699edcc19d08f40bf477211bc77575eb99494a9c1a963613778131b66fa"} Mar 14 07:13:32 crc kubenswrapper[4893]: I0314 07:13:32.292104 4893 generic.go:334] "Generic (PLEG): container finished" podID="e28887b6-81c8-4891-b846-8c2a177c1fd6" containerID="6cec5f500d600f5f255b650d3d520fe42cf5a6a08f93fe28c3ea130fa0939a04" exitCode=0 Mar 14 07:13:32 crc kubenswrapper[4893]: I0314 07:13:32.292151 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44rml" event={"ID":"e28887b6-81c8-4891-b846-8c2a177c1fd6","Type":"ContainerDied","Data":"6cec5f500d600f5f255b650d3d520fe42cf5a6a08f93fe28c3ea130fa0939a04"} Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.583143 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.657927 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt\") pod \"e28887b6-81c8-4891-b846-8c2a177c1fd6\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.658022 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v6sv\" (UniqueName: \"kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv\") pod \"e28887b6-81c8-4891-b846-8c2a177c1fd6\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.658070 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage\") pod \"e28887b6-81c8-4891-b846-8c2a177c1fd6\" (UID: \"e28887b6-81c8-4891-b846-8c2a177c1fd6\") " Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.658166 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e28887b6-81c8-4891-b846-8c2a177c1fd6" (UID: "e28887b6-81c8-4891-b846-8c2a177c1fd6"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.658279 4893 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e28887b6-81c8-4891-b846-8c2a177c1fd6-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.663015 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv" (OuterVolumeSpecName: "kube-api-access-2v6sv") pod "e28887b6-81c8-4891-b846-8c2a177c1fd6" (UID: "e28887b6-81c8-4891-b846-8c2a177c1fd6"). InnerVolumeSpecName "kube-api-access-2v6sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.676989 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e28887b6-81c8-4891-b846-8c2a177c1fd6" (UID: "e28887b6-81c8-4891-b846-8c2a177c1fd6"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.760070 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v6sv\" (UniqueName: \"kubernetes.io/projected/e28887b6-81c8-4891-b846-8c2a177c1fd6-kube-api-access-2v6sv\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:33 crc kubenswrapper[4893]: I0314 07:13:33.760131 4893 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e28887b6-81c8-4891-b846-8c2a177c1fd6-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:34 crc kubenswrapper[4893]: I0314 07:13:34.308678 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44rml" event={"ID":"e28887b6-81c8-4891-b846-8c2a177c1fd6","Type":"ContainerDied","Data":"1f280699edcc19d08f40bf477211bc77575eb99494a9c1a963613778131b66fa"} Mar 14 07:13:34 crc kubenswrapper[4893]: I0314 07:13:34.308724 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f280699edcc19d08f40bf477211bc77575eb99494a9c1a963613778131b66fa" Mar 14 07:13:34 crc kubenswrapper[4893]: I0314 07:13:34.308790 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-44rml" Mar 14 07:13:39 crc kubenswrapper[4893]: I0314 07:13:39.854543 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bszj2" Mar 14 07:13:39 crc kubenswrapper[4893]: I0314 07:13:39.994186 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf"] Mar 14 07:13:39 crc kubenswrapper[4893]: E0314 07:13:39.994437 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28887b6-81c8-4891-b846-8c2a177c1fd6" containerName="storage" Mar 14 07:13:39 crc kubenswrapper[4893]: I0314 07:13:39.994454 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28887b6-81c8-4891-b846-8c2a177c1fd6" containerName="storage" Mar 14 07:13:39 crc kubenswrapper[4893]: I0314 07:13:39.994596 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28887b6-81c8-4891-b846-8c2a177c1fd6" containerName="storage" Mar 14 07:13:39 crc kubenswrapper[4893]: I0314 07:13:39.995474 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.002818 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf"] Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.003292 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.035875 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.036214 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.036310 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zth6r\" (UniqueName: \"kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: 
I0314 07:13:40.137698 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.137793 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.137824 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zth6r\" (UniqueName: \"kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.138350 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.138424 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.169367 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zth6r\" (UniqueName: \"kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.317508 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:40 crc kubenswrapper[4893]: I0314 07:13:40.502480 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf"] Mar 14 07:13:41 crc kubenswrapper[4893]: I0314 07:13:41.351055 4893 generic.go:334] "Generic (PLEG): container finished" podID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerID="58527e0264cbab76d597c0e5699886ba8475b920548a3d7bb1b32e1e6460cfae" exitCode=0 Mar 14 07:13:41 crc kubenswrapper[4893]: I0314 07:13:41.351105 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerDied","Data":"58527e0264cbab76d597c0e5699886ba8475b920548a3d7bb1b32e1e6460cfae"} Mar 14 07:13:41 crc kubenswrapper[4893]: I0314 07:13:41.351146 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerStarted","Data":"1b97ee6a097942de31ec2c1873659cf97ff811ffe8f38fd9ad45545c12cec211"} Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.125649 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.127833 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.137113 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.162618 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k6pw\" (UniqueName: \"kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.162685 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.162717 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " 
pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.264224 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k6pw\" (UniqueName: \"kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.264284 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.264826 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.264873 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.264939 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc 
kubenswrapper[4893]: I0314 07:13:42.293507 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k6pw\" (UniqueName: \"kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw\") pod \"redhat-operators-pjqjl\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.448321 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:42 crc kubenswrapper[4893]: I0314 07:13:42.637006 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:13:42 crc kubenswrapper[4893]: W0314 07:13:42.646235 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod336a87c0_3910_451c_b7ec_1da49c958939.slice/crio-120483a7aef86057e9f8435829575278fc0f08f17bf17b842d6bd8eaeaee95f9 WatchSource:0}: Error finding container 120483a7aef86057e9f8435829575278fc0f08f17bf17b842d6bd8eaeaee95f9: Status 404 returned error can't find the container with id 120483a7aef86057e9f8435829575278fc0f08f17bf17b842d6bd8eaeaee95f9 Mar 14 07:13:43 crc kubenswrapper[4893]: I0314 07:13:43.363366 4893 generic.go:334] "Generic (PLEG): container finished" podID="336a87c0-3910-451c-b7ec-1da49c958939" containerID="4cd46c8067e6363c18f8d0294e223046e272f5d1c525de9d5bcb6041b23ec2e2" exitCode=0 Mar 14 07:13:43 crc kubenswrapper[4893]: I0314 07:13:43.363416 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerDied","Data":"4cd46c8067e6363c18f8d0294e223046e272f5d1c525de9d5bcb6041b23ec2e2"} Mar 14 07:13:43 crc kubenswrapper[4893]: I0314 07:13:43.363442 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerStarted","Data":"120483a7aef86057e9f8435829575278fc0f08f17bf17b842d6bd8eaeaee95f9"} Mar 14 07:13:44 crc kubenswrapper[4893]: I0314 07:13:44.372445 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerStarted","Data":"a5c71d3eb6f97fe9367a045c95fe8a084ebb253d1e3629c563e39010daa7ac87"} Mar 14 07:13:45 crc kubenswrapper[4893]: I0314 07:13:45.383984 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerStarted","Data":"6f2638d11605da999f36fc90644e0997ff9ec3c51da6aa45f8b95dd89beee1f2"} Mar 14 07:13:45 crc kubenswrapper[4893]: I0314 07:13:45.387290 4893 generic.go:334] "Generic (PLEG): container finished" podID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerID="a5c71d3eb6f97fe9367a045c95fe8a084ebb253d1e3629c563e39010daa7ac87" exitCode=0 Mar 14 07:13:45 crc kubenswrapper[4893]: I0314 07:13:45.387350 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerDied","Data":"a5c71d3eb6f97fe9367a045c95fe8a084ebb253d1e3629c563e39010daa7ac87"} Mar 14 07:13:46 crc kubenswrapper[4893]: I0314 07:13:46.395259 4893 generic.go:334] "Generic (PLEG): container finished" podID="336a87c0-3910-451c-b7ec-1da49c958939" containerID="6f2638d11605da999f36fc90644e0997ff9ec3c51da6aa45f8b95dd89beee1f2" exitCode=0 Mar 14 07:13:46 crc kubenswrapper[4893]: I0314 07:13:46.395356 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" 
event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerDied","Data":"6f2638d11605da999f36fc90644e0997ff9ec3c51da6aa45f8b95dd89beee1f2"} Mar 14 07:13:46 crc kubenswrapper[4893]: I0314 07:13:46.400627 4893 generic.go:334] "Generic (PLEG): container finished" podID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerID="2e2613c72986155707943446dfe8f0952fdd54942f68ec2f77a2996e516b5cd9" exitCode=0 Mar 14 07:13:46 crc kubenswrapper[4893]: I0314 07:13:46.400679 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerDied","Data":"2e2613c72986155707943446dfe8f0952fdd54942f68ec2f77a2996e516b5cd9"} Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.414369 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerStarted","Data":"fe1887c899236db08cde769ae67eb362b4a2873fc37f175bbc4791138ae95606"} Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.436831 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjqjl" podStartSLOduration=1.989766687 podStartE2EDuration="5.436808656s" podCreationTimestamp="2026-03-14 07:13:42 +0000 UTC" firstStartedPulling="2026-03-14 07:13:43.364739456 +0000 UTC m=+902.626916248" lastFinishedPulling="2026-03-14 07:13:46.811781435 +0000 UTC m=+906.073958217" observedRunningTime="2026-03-14 07:13:47.435124725 +0000 UTC m=+906.697301587" watchObservedRunningTime="2026-03-14 07:13:47.436808656 +0000 UTC m=+906.698985458" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.650449 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.842162 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle\") pod \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.842224 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth6r\" (UniqueName: \"kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r\") pod \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.842281 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util\") pod \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\" (UID: \"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f\") " Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.843380 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle" (OuterVolumeSpecName: "bundle") pod "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" (UID: "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.850817 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r" (OuterVolumeSpecName: "kube-api-access-zth6r") pod "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" (UID: "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f"). InnerVolumeSpecName "kube-api-access-zth6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.853851 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util" (OuterVolumeSpecName: "util") pod "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" (UID: "eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.943731 4893 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.943797 4893 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:47 crc kubenswrapper[4893]: I0314 07:13:47.943813 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zth6r\" (UniqueName: \"kubernetes.io/projected/eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f-kube-api-access-zth6r\") on node \"crc\" DevicePath \"\"" Mar 14 07:13:48 crc kubenswrapper[4893]: I0314 07:13:48.421159 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" event={"ID":"eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f","Type":"ContainerDied","Data":"1b97ee6a097942de31ec2c1873659cf97ff811ffe8f38fd9ad45545c12cec211"} Mar 14 07:13:48 crc kubenswrapper[4893]: I0314 07:13:48.421188 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf" Mar 14 07:13:48 crc kubenswrapper[4893]: I0314 07:13:48.421219 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b97ee6a097942de31ec2c1873659cf97ff811ffe8f38fd9ad45545c12cec211" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.226915 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg"] Mar 14 07:13:50 crc kubenswrapper[4893]: E0314 07:13:50.227469 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="util" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.227486 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="util" Mar 14 07:13:50 crc kubenswrapper[4893]: E0314 07:13:50.227498 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="extract" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.227504 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="extract" Mar 14 07:13:50 crc kubenswrapper[4893]: E0314 07:13:50.227536 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="pull" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.227546 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="pull" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.227656 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f" containerName="extract" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.228046 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.230984 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.231171 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.231306 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-drdnn" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.235543 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg"] Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.373865 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jzr\" (UniqueName: \"kubernetes.io/projected/ac8e2f6f-8e50-46f3-b062-310098d7091c-kube-api-access-94jzr\") pod \"nmstate-operator-796d4cfff4-cwxwg\" (UID: \"ac8e2f6f-8e50-46f3-b062-310098d7091c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.475629 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jzr\" (UniqueName: \"kubernetes.io/projected/ac8e2f6f-8e50-46f3-b062-310098d7091c-kube-api-access-94jzr\") pod \"nmstate-operator-796d4cfff4-cwxwg\" (UID: \"ac8e2f6f-8e50-46f3-b062-310098d7091c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.494548 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jzr\" (UniqueName: \"kubernetes.io/projected/ac8e2f6f-8e50-46f3-b062-310098d7091c-kube-api-access-94jzr\") pod \"nmstate-operator-796d4cfff4-cwxwg\" (UID: 
\"ac8e2f6f-8e50-46f3-b062-310098d7091c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.543143 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" Mar 14 07:13:50 crc kubenswrapper[4893]: I0314 07:13:50.752543 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg"] Mar 14 07:13:50 crc kubenswrapper[4893]: W0314 07:13:50.757746 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8e2f6f_8e50_46f3_b062_310098d7091c.slice/crio-295a5a89620559714fe5fecd8d3ca1f8176c46242df33c3892364d7754c363a7 WatchSource:0}: Error finding container 295a5a89620559714fe5fecd8d3ca1f8176c46242df33c3892364d7754c363a7: Status 404 returned error can't find the container with id 295a5a89620559714fe5fecd8d3ca1f8176c46242df33c3892364d7754c363a7 Mar 14 07:13:51 crc kubenswrapper[4893]: I0314 07:13:51.438350 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" event={"ID":"ac8e2f6f-8e50-46f3-b062-310098d7091c","Type":"ContainerStarted","Data":"295a5a89620559714fe5fecd8d3ca1f8176c46242df33c3892364d7754c363a7"} Mar 14 07:13:52 crc kubenswrapper[4893]: I0314 07:13:52.449246 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:52 crc kubenswrapper[4893]: I0314 07:13:52.449598 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:13:53 crc kubenswrapper[4893]: I0314 07:13:53.497597 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pjqjl" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="registry-server" probeResult="failure" output=< Mar 
14 07:13:53 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:13:53 crc kubenswrapper[4893]: > Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.474330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" event={"ID":"ac8e2f6f-8e50-46f3-b062-310098d7091c","Type":"ContainerStarted","Data":"00f0e29b270ebc5679a56221647ff6313ec1c7dc6c47cea7349763acc0e16c3c"} Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.502117 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cwxwg" podStartSLOduration=1.061756223 podStartE2EDuration="7.50208968s" podCreationTimestamp="2026-03-14 07:13:50 +0000 UTC" firstStartedPulling="2026-03-14 07:13:50.75942969 +0000 UTC m=+910.021606482" lastFinishedPulling="2026-03-14 07:13:57.199763137 +0000 UTC m=+916.461939939" observedRunningTime="2026-03-14 07:13:57.495998161 +0000 UTC m=+916.758174963" watchObservedRunningTime="2026-03-14 07:13:57.50208968 +0000 UTC m=+916.764266502" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.527590 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.528797 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.549014 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.659630 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.659683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tphsj\" (UniqueName: \"kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.659708 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.761112 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.761408 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tphsj\" (UniqueName: \"kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.761566 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.761767 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.762247 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.786723 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphsj\" (UniqueName: \"kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj\") pod \"certified-operators-snbq4\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:57 crc kubenswrapper[4893]: I0314 07:13:57.854703 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:13:58 crc kubenswrapper[4893]: I0314 07:13:58.129966 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:13:58 crc kubenswrapper[4893]: W0314 07:13:58.135467 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686f244c_6569_41e7_a316_de89833dad05.slice/crio-c23f3770d8d51751074955fd710268f07814d8529021165472a74d9c7f73145d WatchSource:0}: Error finding container c23f3770d8d51751074955fd710268f07814d8529021165472a74d9c7f73145d: Status 404 returned error can't find the container with id c23f3770d8d51751074955fd710268f07814d8529021165472a74d9c7f73145d Mar 14 07:13:58 crc kubenswrapper[4893]: I0314 07:13:58.480778 4893 generic.go:334] "Generic (PLEG): container finished" podID="686f244c-6569-41e7-a316-de89833dad05" containerID="18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067" exitCode=0 Mar 14 07:13:58 crc kubenswrapper[4893]: I0314 07:13:58.480831 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerDied","Data":"18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067"} Mar 14 07:13:58 crc kubenswrapper[4893]: I0314 07:13:58.481079 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerStarted","Data":"c23f3770d8d51751074955fd710268f07814d8529021165472a74d9c7f73145d"} Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.842434 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.843694 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.855544 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lm6j2" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.876781 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v47ts"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.880012 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.889706 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.898510 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.928501 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k468n"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.929485 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.944919 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v47ts"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.991348 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d6b31376-0d73-47a7-93e6-26f77823be30-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.991402 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5zfw\" (UniqueName: \"kubernetes.io/projected/d6b31376-0d73-47a7-93e6-26f77823be30-kube-api-access-b5zfw\") pod \"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.991432 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb676\" (UniqueName: \"kubernetes.io/projected/d76392bf-1abc-406a-bd1e-bb95efdba8fc-kube-api-access-pb676\") pod \"nmstate-metrics-9b8c8685d-cqr9d\" (UID: \"d76392bf-1abc-406a-bd1e-bb95efdba8fc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.997350 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7"] Mar 14 07:13:59 crc kubenswrapper[4893]: I0314 07:13:59.998012 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.005322 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.005414 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hgtmw" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.005477 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.006306 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.092956 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-nmstate-lock\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093078 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb676\" (UniqueName: \"kubernetes.io/projected/d76392bf-1abc-406a-bd1e-bb95efdba8fc-kube-api-access-pb676\") pod \"nmstate-metrics-9b8c8685d-cqr9d\" (UID: \"d76392bf-1abc-406a-bd1e-bb95efdba8fc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093137 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9fe9c4-e01d-4c14-babe-c16587326c63-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093211 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7dm\" (UniqueName: \"kubernetes.io/projected/7d9fe9c4-e01d-4c14-babe-c16587326c63-kube-api-access-tv7dm\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093291 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-ovs-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093325 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-dbus-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093380 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d6b31376-0d73-47a7-93e6-26f77823be30-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093492 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5zfw\" (UniqueName: \"kubernetes.io/projected/d6b31376-0d73-47a7-93e6-26f77823be30-kube-api-access-b5zfw\") pod 
\"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093558 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrnt\" (UniqueName: \"kubernetes.io/projected/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-kube-api-access-8wrnt\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.093602 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d9fe9c4-e01d-4c14-babe-c16587326c63-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.102620 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d6b31376-0d73-47a7-93e6-26f77823be30-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.111491 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5zfw\" (UniqueName: \"kubernetes.io/projected/d6b31376-0d73-47a7-93e6-26f77823be30-kube-api-access-b5zfw\") pod \"nmstate-webhook-5f558f5558-v47ts\" (UID: \"d6b31376-0d73-47a7-93e6-26f77823be30\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.117595 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb676\" (UniqueName: 
\"kubernetes.io/projected/d76392bf-1abc-406a-bd1e-bb95efdba8fc-kube-api-access-pb676\") pod \"nmstate-metrics-9b8c8685d-cqr9d\" (UID: \"d76392bf-1abc-406a-bd1e-bb95efdba8fc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.136579 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557874-7lqhj"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.137318 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.139898 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.141022 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.141149 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.148594 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-7lqhj"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.190435 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194692 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrnt\" (UniqueName: \"kubernetes.io/projected/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-kube-api-access-8wrnt\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194741 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d9fe9c4-e01d-4c14-babe-c16587326c63-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194769 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-nmstate-lock\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194801 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9fe9c4-e01d-4c14-babe-c16587326c63-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194835 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7dm\" (UniqueName: \"kubernetes.io/projected/7d9fe9c4-e01d-4c14-babe-c16587326c63-kube-api-access-tv7dm\") pod 
\"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194880 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-ovs-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.194896 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-dbus-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.195021 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-nmstate-lock\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.195382 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-ovs-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.195969 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-dbus-socket\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " 
pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.196082 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d9fe9c4-e01d-4c14-babe-c16587326c63-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.199891 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9fe9c4-e01d-4c14-babe-c16587326c63-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.219993 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.220918 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7dm\" (UniqueName: \"kubernetes.io/projected/7d9fe9c4-e01d-4c14-babe-c16587326c63-kube-api-access-tv7dm\") pod \"nmstate-console-plugin-86f58fcf4-vzjv7\" (UID: \"7d9fe9c4-e01d-4c14-babe-c16587326c63\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.221342 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrnt\" (UniqueName: \"kubernetes.io/projected/132b2e20-99cc-4cba-b1dd-9d7f51cb774c-kube-api-access-8wrnt\") pod \"nmstate-handler-k468n\" (UID: \"132b2e20-99cc-4cba-b1dd-9d7f51cb774c\") " pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.243760 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-679b7bfb6c-gwjc2"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.248083 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.249817 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.257122 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b7bfb6c-gwjc2"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.295961 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgfgh\" (UniqueName: \"kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh\") pod \"auto-csr-approver-29557874-7lqhj\" (UID: \"521aeee7-0d64-4708-8f7b-7718bfbaed47\") " pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.320877 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397315 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397359 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-service-ca\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397377 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-oauth-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397418 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-trusted-ca-bundle\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397497 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-console-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397542 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgfgh\" (UniqueName: \"kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh\") pod \"auto-csr-approver-29557874-7lqhj\" (UID: \"521aeee7-0d64-4708-8f7b-7718bfbaed47\") " pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397564 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqdgc\" (UniqueName: \"kubernetes.io/projected/582c19dd-8724-4bb9-b749-92384b2961ee-kube-api-access-nqdgc\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.397590 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-oauth-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.417404 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgfgh\" (UniqueName: \"kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh\") pod \"auto-csr-approver-29557874-7lqhj\" (UID: \"521aeee7-0d64-4708-8f7b-7718bfbaed47\") " pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.492566 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-k468n" event={"ID":"132b2e20-99cc-4cba-b1dd-9d7f51cb774c","Type":"ContainerStarted","Data":"77306734b67656e2a474346578fbde145eb59169431b5b41ce5a0938f56ee780"} Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.494241 4893 generic.go:334] "Generic (PLEG): container finished" podID="686f244c-6569-41e7-a316-de89833dad05" containerID="28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71" exitCode=0 Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.494336 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerDied","Data":"28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71"} Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499122 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqdgc\" (UniqueName: \"kubernetes.io/projected/582c19dd-8724-4bb9-b749-92384b2961ee-kube-api-access-nqdgc\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499171 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-oauth-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499205 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-service-ca\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: 
I0314 07:14:00.499226 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-oauth-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499249 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499290 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-trusted-ca-bundle\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.499345 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-console-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.500382 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-service-ca\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.500427 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-oauth-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.500455 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-trusted-ca-bundle\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.500517 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/582c19dd-8724-4bb9-b749-92384b2961ee-console-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.502961 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-oauth-config\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.502978 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/582c19dd-8724-4bb9-b749-92384b2961ee-console-serving-cert\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.516552 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nqdgc\" (UniqueName: \"kubernetes.io/projected/582c19dd-8724-4bb9-b749-92384b2961ee-kube-api-access-nqdgc\") pod \"console-679b7bfb6c-gwjc2\" (UID: \"582c19dd-8724-4bb9-b749-92384b2961ee\") " pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.540956 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v47ts"] Mar 14 07:14:00 crc kubenswrapper[4893]: W0314 07:14:00.546155 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b31376_0d73_47a7_93e6_26f77823be30.slice/crio-1455940f20c7e38239e5e0121c99c89ac908033bdbf45b9fb944b9ad6b4d5f2c WatchSource:0}: Error finding container 1455940f20c7e38239e5e0121c99c89ac908033bdbf45b9fb944b9ad6b4d5f2c: Status 404 returned error can't find the container with id 1455940f20c7e38239e5e0121c99c89ac908033bdbf45b9fb944b9ad6b4d5f2c Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.569105 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.581405 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.696994 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d"] Mar 14 07:14:00 crc kubenswrapper[4893]: W0314 07:14:00.759364 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76392bf_1abc_406a_bd1e_bb95efdba8fc.slice/crio-5df3b16b9cf9874b9756cb6dbdf01621f209bc57c0f1e5d1579e9e70d1b770ba WatchSource:0}: Error finding container 5df3b16b9cf9874b9756cb6dbdf01621f209bc57c0f1e5d1579e9e70d1b770ba: Status 404 returned error can't find the container with id 5df3b16b9cf9874b9756cb6dbdf01621f209bc57c0f1e5d1579e9e70d1b770ba Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.800171 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.825266 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-7lqhj"] Mar 14 07:14:00 crc kubenswrapper[4893]: I0314 07:14:00.921441 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b7bfb6c-gwjc2"] Mar 14 07:14:00 crc kubenswrapper[4893]: W0314 07:14:00.934749 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582c19dd_8724_4bb9_b749_92384b2961ee.slice/crio-a8e721bb8e053d6ef09a3f4cc3a3b8665b12436d557ef9dad579ee3465b6534b WatchSource:0}: Error finding container a8e721bb8e053d6ef09a3f4cc3a3b8665b12436d557ef9dad579ee3465b6534b: Status 404 returned error can't find the container with id a8e721bb8e053d6ef09a3f4cc3a3b8665b12436d557ef9dad579ee3465b6534b Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.508378 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" event={"ID":"d76392bf-1abc-406a-bd1e-bb95efdba8fc","Type":"ContainerStarted","Data":"5df3b16b9cf9874b9756cb6dbdf01621f209bc57c0f1e5d1579e9e70d1b770ba"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.513915 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b7bfb6c-gwjc2" event={"ID":"582c19dd-8724-4bb9-b749-92384b2961ee","Type":"ContainerStarted","Data":"b6d87e6b7a489bb3f698ade4304f86a93c2f20cc7265dc53df093362adcc1e02"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.513953 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b7bfb6c-gwjc2" event={"ID":"582c19dd-8724-4bb9-b749-92384b2961ee","Type":"ContainerStarted","Data":"a8e721bb8e053d6ef09a3f4cc3a3b8665b12436d557ef9dad579ee3465b6534b"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.515236 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" event={"ID":"7d9fe9c4-e01d-4c14-babe-c16587326c63","Type":"ContainerStarted","Data":"07d350087b035dc2c226bfc0478fbe9ba6c21dbffe8db396404dd1683643bc6b"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.521486 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerStarted","Data":"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.523203 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" event={"ID":"521aeee7-0d64-4708-8f7b-7718bfbaed47","Type":"ContainerStarted","Data":"6eb04ac53e039dcccb84e5247cbe3fce55051dfa6f807f9f819d116a4dde083a"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.524226 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" event={"ID":"d6b31376-0d73-47a7-93e6-26f77823be30","Type":"ContainerStarted","Data":"1455940f20c7e38239e5e0121c99c89ac908033bdbf45b9fb944b9ad6b4d5f2c"} Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.534217 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-679b7bfb6c-gwjc2" podStartSLOduration=1.534028315 podStartE2EDuration="1.534028315s" podCreationTimestamp="2026-03-14 07:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:14:01.531263048 +0000 UTC m=+920.793439860" watchObservedRunningTime="2026-03-14 07:14:01.534028315 +0000 UTC m=+920.796205107" Mar 14 07:14:01 crc kubenswrapper[4893]: I0314 07:14:01.553735 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-snbq4" podStartSLOduration=1.769679028 podStartE2EDuration="4.553717757s" podCreationTimestamp="2026-03-14 07:13:57 +0000 UTC" firstStartedPulling="2026-03-14 07:13:58.482064425 +0000 UTC m=+917.744241217" lastFinishedPulling="2026-03-14 07:14:01.266103154 +0000 UTC m=+920.528279946" observedRunningTime="2026-03-14 07:14:01.549612487 +0000 UTC m=+920.811789279" watchObservedRunningTime="2026-03-14 07:14:01.553717757 +0000 UTC m=+920.815894549" Mar 14 07:14:02 crc kubenswrapper[4893]: I0314 07:14:02.491124 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:14:02 crc kubenswrapper[4893]: I0314 07:14:02.538802 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:14:03 crc kubenswrapper[4893]: I0314 07:14:03.542855 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" 
event={"ID":"521aeee7-0d64-4708-8f7b-7718bfbaed47","Type":"ContainerStarted","Data":"d61478f864ddff1c56163078581ecd2c7ef46fb5f00c3a2cbc2635474642b2fd"} Mar 14 07:14:03 crc kubenswrapper[4893]: I0314 07:14:03.558511 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" podStartSLOduration=2.055647518 podStartE2EDuration="3.558477926s" podCreationTimestamp="2026-03-14 07:14:00 +0000 UTC" firstStartedPulling="2026-03-14 07:14:00.899457347 +0000 UTC m=+920.161634139" lastFinishedPulling="2026-03-14 07:14:02.402287735 +0000 UTC m=+921.664464547" observedRunningTime="2026-03-14 07:14:03.556256271 +0000 UTC m=+922.818433073" watchObservedRunningTime="2026-03-14 07:14:03.558477926 +0000 UTC m=+922.820654718" Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.121455 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.121702 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pjqjl" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="registry-server" containerID="cri-o://fe1887c899236db08cde769ae67eb362b4a2873fc37f175bbc4791138ae95606" gracePeriod=2 Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.550471 4893 generic.go:334] "Generic (PLEG): container finished" podID="336a87c0-3910-451c-b7ec-1da49c958939" containerID="fe1887c899236db08cde769ae67eb362b4a2873fc37f175bbc4791138ae95606" exitCode=0 Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.550603 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerDied","Data":"fe1887c899236db08cde769ae67eb362b4a2873fc37f175bbc4791138ae95606"} Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.551972 4893 generic.go:334] "Generic (PLEG): 
container finished" podID="521aeee7-0d64-4708-8f7b-7718bfbaed47" containerID="d61478f864ddff1c56163078581ecd2c7ef46fb5f00c3a2cbc2635474642b2fd" exitCode=0 Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.552021 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" event={"ID":"521aeee7-0d64-4708-8f7b-7718bfbaed47","Type":"ContainerDied","Data":"d61478f864ddff1c56163078581ecd2c7ef46fb5f00c3a2cbc2635474642b2fd"} Mar 14 07:14:04 crc kubenswrapper[4893]: I0314 07:14:04.955955 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.085154 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content\") pod \"336a87c0-3910-451c-b7ec-1da49c958939\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.085243 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k6pw\" (UniqueName: \"kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw\") pod \"336a87c0-3910-451c-b7ec-1da49c958939\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.085302 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities\") pod \"336a87c0-3910-451c-b7ec-1da49c958939\" (UID: \"336a87c0-3910-451c-b7ec-1da49c958939\") " Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.086506 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities" (OuterVolumeSpecName: "utilities") pod 
"336a87c0-3910-451c-b7ec-1da49c958939" (UID: "336a87c0-3910-451c-b7ec-1da49c958939"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.090335 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw" (OuterVolumeSpecName: "kube-api-access-6k6pw") pod "336a87c0-3910-451c-b7ec-1da49c958939" (UID: "336a87c0-3910-451c-b7ec-1da49c958939"). InnerVolumeSpecName "kube-api-access-6k6pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.186516 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k6pw\" (UniqueName: \"kubernetes.io/projected/336a87c0-3910-451c-b7ec-1da49c958939-kube-api-access-6k6pw\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.186570 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.247922 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "336a87c0-3910-451c-b7ec-1da49c958939" (UID: "336a87c0-3910-451c-b7ec-1da49c958939"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.287655 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336a87c0-3910-451c-b7ec-1da49c958939-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.561308 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjqjl" event={"ID":"336a87c0-3910-451c-b7ec-1da49c958939","Type":"ContainerDied","Data":"120483a7aef86057e9f8435829575278fc0f08f17bf17b842d6bd8eaeaee95f9"} Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.561401 4893 scope.go:117] "RemoveContainer" containerID="fe1887c899236db08cde769ae67eb362b4a2873fc37f175bbc4791138ae95606" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.561427 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjqjl" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.592185 4893 scope.go:117] "RemoveContainer" containerID="6f2638d11605da999f36fc90644e0997ff9ec3c51da6aa45f8b95dd89beee1f2" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.595003 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.601265 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pjqjl"] Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.620581 4893 scope.go:117] "RemoveContainer" containerID="4cd46c8067e6363c18f8d0294e223046e272f5d1c525de9d5bcb6041b23ec2e2" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.808159 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:05 crc kubenswrapper[4893]: I0314 07:14:05.996507 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgfgh\" (UniqueName: \"kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh\") pod \"521aeee7-0d64-4708-8f7b-7718bfbaed47\" (UID: \"521aeee7-0d64-4708-8f7b-7718bfbaed47\") " Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.006472 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh" (OuterVolumeSpecName: "kube-api-access-zgfgh") pod "521aeee7-0d64-4708-8f7b-7718bfbaed47" (UID: "521aeee7-0d64-4708-8f7b-7718bfbaed47"). InnerVolumeSpecName "kube-api-access-zgfgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.098254 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgfgh\" (UniqueName: \"kubernetes.io/projected/521aeee7-0d64-4708-8f7b-7718bfbaed47-kube-api-access-zgfgh\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.568611 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" event={"ID":"d6b31376-0d73-47a7-93e6-26f77823be30","Type":"ContainerStarted","Data":"9deb243a408905d11fbcf66e37b5d5ebc590826cd09b8331ec9d529a35d032a0"} Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.568817 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.571320 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" 
event={"ID":"d76392bf-1abc-406a-bd1e-bb95efdba8fc","Type":"ContainerStarted","Data":"442cfd22e1ea8fa070c34d938906f60952a2d24dc52113d7032faffab54c35cd"} Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.573425 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k468n" event={"ID":"132b2e20-99cc-4cba-b1dd-9d7f51cb774c","Type":"ContainerStarted","Data":"1c625db0a92659e72d113362bfedf14e5d3ee650045d9f50047a391e53a39dc8"} Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.575944 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" event={"ID":"7d9fe9c4-e01d-4c14-babe-c16587326c63","Type":"ContainerStarted","Data":"7890cf5c5da2b18de882d73427fac7dce2655e98355af356c42c91ba87059c2e"} Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.594512 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" event={"ID":"521aeee7-0d64-4708-8f7b-7718bfbaed47","Type":"ContainerDied","Data":"6eb04ac53e039dcccb84e5247cbe3fce55051dfa6f807f9f819d116a4dde083a"} Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.594622 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb04ac53e039dcccb84e5247cbe3fce55051dfa6f807f9f819d116a4dde083a" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.594705 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-7lqhj" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.595055 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" podStartSLOduration=3.100287911 podStartE2EDuration="7.595004878s" podCreationTimestamp="2026-03-14 07:13:59 +0000 UTC" firstStartedPulling="2026-03-14 07:14:00.548284169 +0000 UTC m=+919.810460961" lastFinishedPulling="2026-03-14 07:14:05.043001116 +0000 UTC m=+924.305177928" observedRunningTime="2026-03-14 07:14:06.587107305 +0000 UTC m=+925.849284197" watchObservedRunningTime="2026-03-14 07:14:06.595004878 +0000 UTC m=+925.857181670" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.612104 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k468n" podStartSLOduration=2.843088872 podStartE2EDuration="7.612082326s" podCreationTimestamp="2026-03-14 07:13:59 +0000 UTC" firstStartedPulling="2026-03-14 07:14:00.275217402 +0000 UTC m=+919.537394194" lastFinishedPulling="2026-03-14 07:14:05.044210856 +0000 UTC m=+924.306387648" observedRunningTime="2026-03-14 07:14:06.602751388 +0000 UTC m=+925.864928200" watchObservedRunningTime="2026-03-14 07:14:06.612082326 +0000 UTC m=+925.874259128" Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.634181 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-xdsfw"] Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.640156 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-xdsfw"] Mar 14 07:14:06 crc kubenswrapper[4893]: I0314 07:14:06.642041 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vzjv7" podStartSLOduration=3.444724855 podStartE2EDuration="7.64202424s" podCreationTimestamp="2026-03-14 07:13:59 +0000 
UTC" firstStartedPulling="2026-03-14 07:14:00.846706986 +0000 UTC m=+920.108883778" lastFinishedPulling="2026-03-14 07:14:05.044006371 +0000 UTC m=+924.306183163" observedRunningTime="2026-03-14 07:14:06.62242472 +0000 UTC m=+925.884601522" watchObservedRunningTime="2026-03-14 07:14:06.64202424 +0000 UTC m=+925.904201022" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.384841 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336a87c0-3910-451c-b7ec-1da49c958939" path="/var/lib/kubelet/pods/336a87c0-3910-451c-b7ec-1da49c958939/volumes" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.385620 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a845fb3-b401-45ba-8c57-5d2b5a7e4320" path="/var/lib/kubelet/pods/8a845fb3-b401-45ba-8c57-5d2b5a7e4320/volumes" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.602261 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.856007 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.856825 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:07 crc kubenswrapper[4893]: I0314 07:14:07.894394 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529086 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:08 crc kubenswrapper[4893]: E0314 07:14:08.529679 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521aeee7-0d64-4708-8f7b-7718bfbaed47" containerName="oc" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529695 
4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="521aeee7-0d64-4708-8f7b-7718bfbaed47" containerName="oc" Mar 14 07:14:08 crc kubenswrapper[4893]: E0314 07:14:08.529718 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="extract-content" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529726 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="extract-content" Mar 14 07:14:08 crc kubenswrapper[4893]: E0314 07:14:08.529745 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="extract-utilities" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529754 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="extract-utilities" Mar 14 07:14:08 crc kubenswrapper[4893]: E0314 07:14:08.529764 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="registry-server" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529774 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="registry-server" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529902 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="336a87c0-3910-451c-b7ec-1da49c958939" containerName="registry-server" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.529921 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="521aeee7-0d64-4708-8f7b-7718bfbaed47" containerName="oc" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.531828 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.546927 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.642180 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pph6\" (UniqueName: \"kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.642274 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.643370 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.651932 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.744985 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities\") pod \"redhat-marketplace-npfjh\" (UID: 
\"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.745070 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.745127 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pph6\" (UniqueName: \"kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.745748 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.745904 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content\") pod \"redhat-marketplace-npfjh\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.767436 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pph6\" (UniqueName: \"kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6\") pod \"redhat-marketplace-npfjh\" (UID: 
\"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:08 crc kubenswrapper[4893]: I0314 07:14:08.866130 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.056676 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:09 crc kubenswrapper[4893]: W0314 07:14:09.061346 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76288b33_f834_43f4_ba1a_7be4a9ec82f0.slice/crio-f2c775392b3e428c31ac6f46590a999640805444a2ab85eb778930f6f347aa4a WatchSource:0}: Error finding container f2c775392b3e428c31ac6f46590a999640805444a2ab85eb778930f6f347aa4a: Status 404 returned error can't find the container with id f2c775392b3e428c31ac6f46590a999640805444a2ab85eb778930f6f347aa4a Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.621690 4893 generic.go:334] "Generic (PLEG): container finished" podID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerID="6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a" exitCode=0 Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.621751 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerDied","Data":"6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a"} Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.622075 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerStarted","Data":"f2c775392b3e428c31ac6f46590a999640805444a2ab85eb778930f6f347aa4a"} Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.624802 4893 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" event={"ID":"d76392bf-1abc-406a-bd1e-bb95efdba8fc","Type":"ContainerStarted","Data":"1589df5f146d8352373405587894b01e9efd56e5dc64b0aafa12b8ebcf44b8cd"} Mar 14 07:14:09 crc kubenswrapper[4893]: I0314 07:14:09.660059 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqr9d" podStartSLOduration=2.68288523 podStartE2EDuration="10.660040059s" podCreationTimestamp="2026-03-14 07:13:59 +0000 UTC" firstStartedPulling="2026-03-14 07:14:00.763708393 +0000 UTC m=+920.025885185" lastFinishedPulling="2026-03-14 07:14:08.740863222 +0000 UTC m=+928.003040014" observedRunningTime="2026-03-14 07:14:09.659294991 +0000 UTC m=+928.921471843" watchObservedRunningTime="2026-03-14 07:14:09.660040059 +0000 UTC m=+928.922216851" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.272018 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k468n" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.317369 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.582137 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.582630 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.590028 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.631450 4893 generic.go:334] "Generic (PLEG): container finished" podID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" 
containerID="793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291" exitCode=0 Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.632878 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerDied","Data":"793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291"} Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.637399 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-679b7bfb6c-gwjc2" Mar 14 07:14:10 crc kubenswrapper[4893]: I0314 07:14:10.698217 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-psm2j"] Mar 14 07:14:11 crc kubenswrapper[4893]: I0314 07:14:11.640344 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-snbq4" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="registry-server" containerID="cri-o://f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b" gracePeriod=2 Mar 14 07:14:11 crc kubenswrapper[4893]: I0314 07:14:11.640902 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerStarted","Data":"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b"} Mar 14 07:14:11 crc kubenswrapper[4893]: I0314 07:14:11.667138 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-npfjh" podStartSLOduration=2.26123897 podStartE2EDuration="3.667120204s" podCreationTimestamp="2026-03-14 07:14:08 +0000 UTC" firstStartedPulling="2026-03-14 07:14:09.624394446 +0000 UTC m=+928.886571248" lastFinishedPulling="2026-03-14 07:14:11.03027569 +0000 UTC m=+930.292452482" observedRunningTime="2026-03-14 07:14:11.665104364 +0000 UTC 
m=+930.927281186" watchObservedRunningTime="2026-03-14 07:14:11.667120204 +0000 UTC m=+930.929296996" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.229121 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.394649 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tphsj\" (UniqueName: \"kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj\") pod \"686f244c-6569-41e7-a316-de89833dad05\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.394764 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities\") pod \"686f244c-6569-41e7-a316-de89833dad05\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.394812 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content\") pod \"686f244c-6569-41e7-a316-de89833dad05\" (UID: \"686f244c-6569-41e7-a316-de89833dad05\") " Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.395643 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities" (OuterVolumeSpecName: "utilities") pod "686f244c-6569-41e7-a316-de89833dad05" (UID: "686f244c-6569-41e7-a316-de89833dad05"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.401057 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj" (OuterVolumeSpecName: "kube-api-access-tphsj") pod "686f244c-6569-41e7-a316-de89833dad05" (UID: "686f244c-6569-41e7-a316-de89833dad05"). InnerVolumeSpecName "kube-api-access-tphsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.452805 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "686f244c-6569-41e7-a316-de89833dad05" (UID: "686f244c-6569-41e7-a316-de89833dad05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.496413 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tphsj\" (UniqueName: \"kubernetes.io/projected/686f244c-6569-41e7-a316-de89833dad05-kube-api-access-tphsj\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.496451 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.496462 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686f244c-6569-41e7-a316-de89833dad05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.649049 4893 generic.go:334] "Generic (PLEG): container finished" podID="686f244c-6569-41e7-a316-de89833dad05" 
containerID="f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b" exitCode=0 Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.649120 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerDied","Data":"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b"} Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.649199 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snbq4" event={"ID":"686f244c-6569-41e7-a316-de89833dad05","Type":"ContainerDied","Data":"c23f3770d8d51751074955fd710268f07814d8529021165472a74d9c7f73145d"} Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.649195 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snbq4" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.649217 4893 scope.go:117] "RemoveContainer" containerID="f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.668336 4893 scope.go:117] "RemoveContainer" containerID="28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.675478 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.680263 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-snbq4"] Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.709939 4893 scope.go:117] "RemoveContainer" containerID="18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.725575 4893 scope.go:117] "RemoveContainer" containerID="f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b" Mar 14 
07:14:12 crc kubenswrapper[4893]: E0314 07:14:12.726103 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b\": container with ID starting with f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b not found: ID does not exist" containerID="f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.726143 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b"} err="failed to get container status \"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b\": rpc error: code = NotFound desc = could not find container \"f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b\": container with ID starting with f881004e0fff8d534bcdbb75b0e327a43bcec8dafb657e5ec91e46e5ca852e2b not found: ID does not exist" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.726167 4893 scope.go:117] "RemoveContainer" containerID="28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71" Mar 14 07:14:12 crc kubenswrapper[4893]: E0314 07:14:12.726426 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71\": container with ID starting with 28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71 not found: ID does not exist" containerID="28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.726453 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71"} err="failed to get container status 
\"28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71\": rpc error: code = NotFound desc = could not find container \"28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71\": container with ID starting with 28f14880dc087134114847ce1d89721e91d9faf3b3c1ba3a1d27c58ad4bbce71 not found: ID does not exist" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.726471 4893 scope.go:117] "RemoveContainer" containerID="18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067" Mar 14 07:14:12 crc kubenswrapper[4893]: E0314 07:14:12.726856 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067\": container with ID starting with 18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067 not found: ID does not exist" containerID="18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067" Mar 14 07:14:12 crc kubenswrapper[4893]: I0314 07:14:12.726902 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067"} err="failed to get container status \"18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067\": rpc error: code = NotFound desc = could not find container \"18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067\": container with ID starting with 18119777fb8dca78364c38c2743c5ecdcf6a67d15f5d24ab15e6e3158e787067 not found: ID does not exist" Mar 14 07:14:13 crc kubenswrapper[4893]: I0314 07:14:13.382374 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686f244c-6569-41e7-a316-de89833dad05" path="/var/lib/kubelet/pods/686f244c-6569-41e7-a316-de89833dad05/volumes" Mar 14 07:14:18 crc kubenswrapper[4893]: I0314 07:14:18.867305 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:18 crc kubenswrapper[4893]: I0314 07:14:18.868243 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:18 crc kubenswrapper[4893]: I0314 07:14:18.951718 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:19 crc kubenswrapper[4893]: I0314 07:14:19.756411 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:19 crc kubenswrapper[4893]: I0314 07:14:19.806214 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:20 crc kubenswrapper[4893]: I0314 07:14:20.225660 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v47ts" Mar 14 07:14:21 crc kubenswrapper[4893]: I0314 07:14:21.711068 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-npfjh" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="registry-server" containerID="cri-o://8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b" gracePeriod=2 Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.614753 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.719321 4893 generic.go:334] "Generic (PLEG): container finished" podID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerID="8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b" exitCode=0 Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.719361 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerDied","Data":"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b"} Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.719367 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npfjh" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.719398 4893 scope.go:117] "RemoveContainer" containerID="8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.719388 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npfjh" event={"ID":"76288b33-f834-43f4-ba1a-7be4a9ec82f0","Type":"ContainerDied","Data":"f2c775392b3e428c31ac6f46590a999640805444a2ab85eb778930f6f347aa4a"} Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.721704 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pph6\" (UniqueName: \"kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6\") pod \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.721776 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content\") pod 
\"76288b33-f834-43f4-ba1a-7be4a9ec82f0\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.721851 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities\") pod \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\" (UID: \"76288b33-f834-43f4-ba1a-7be4a9ec82f0\") " Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.722904 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities" (OuterVolumeSpecName: "utilities") pod "76288b33-f834-43f4-ba1a-7be4a9ec82f0" (UID: "76288b33-f834-43f4-ba1a-7be4a9ec82f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.740767 4893 scope.go:117] "RemoveContainer" containerID="793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.740846 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6" (OuterVolumeSpecName: "kube-api-access-4pph6") pod "76288b33-f834-43f4-ba1a-7be4a9ec82f0" (UID: "76288b33-f834-43f4-ba1a-7be4a9ec82f0"). InnerVolumeSpecName "kube-api-access-4pph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.755416 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76288b33-f834-43f4-ba1a-7be4a9ec82f0" (UID: "76288b33-f834-43f4-ba1a-7be4a9ec82f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.762410 4893 scope.go:117] "RemoveContainer" containerID="6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.783604 4893 scope.go:117] "RemoveContainer" containerID="8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b" Mar 14 07:14:22 crc kubenswrapper[4893]: E0314 07:14:22.784028 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b\": container with ID starting with 8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b not found: ID does not exist" containerID="8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.784073 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b"} err="failed to get container status \"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b\": rpc error: code = NotFound desc = could not find container \"8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b\": container with ID starting with 8fdf509605f98f2d1627b74accd73cc967ab8363dd1b5d5296bf9dbb91a7783b not found: ID does not exist" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.784101 4893 scope.go:117] "RemoveContainer" containerID="793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291" Mar 14 07:14:22 crc kubenswrapper[4893]: E0314 07:14:22.784453 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291\": container with ID starting with 
793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291 not found: ID does not exist" containerID="793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.784479 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291"} err="failed to get container status \"793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291\": rpc error: code = NotFound desc = could not find container \"793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291\": container with ID starting with 793232b339a78f35c3aa4113346abc7394daab75a36cf1b7f7f65554abbaf291 not found: ID does not exist" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.784494 4893 scope.go:117] "RemoveContainer" containerID="6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a" Mar 14 07:14:22 crc kubenswrapper[4893]: E0314 07:14:22.784812 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a\": container with ID starting with 6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a not found: ID does not exist" containerID="6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.784833 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a"} err="failed to get container status \"6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a\": rpc error: code = NotFound desc = could not find container \"6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a\": container with ID starting with 6e6c05c0f80b01e3929ed2af4fd62baf09a7f1d736684a48342b97ec71f07c0a not found: ID does not 
exist" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.823742 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pph6\" (UniqueName: \"kubernetes.io/projected/76288b33-f834-43f4-ba1a-7be4a9ec82f0-kube-api-access-4pph6\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.823774 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:22 crc kubenswrapper[4893]: I0314 07:14:22.823783 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76288b33-f834-43f4-ba1a-7be4a9ec82f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:23 crc kubenswrapper[4893]: I0314 07:14:23.048898 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:23 crc kubenswrapper[4893]: I0314 07:14:23.053706 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-npfjh"] Mar 14 07:14:23 crc kubenswrapper[4893]: I0314 07:14:23.386472 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" path="/var/lib/kubelet/pods/76288b33-f834-43f4-ba1a-7be4a9ec82f0/volumes" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.736021 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:26 crc kubenswrapper[4893]: E0314 07:14:26.738094 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="extract-content" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.738284 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="extract-content" Mar 14 07:14:26 crc 
kubenswrapper[4893]: E0314 07:14:26.738436 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="extract-utilities" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.785129 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="extract-utilities" Mar 14 07:14:26 crc kubenswrapper[4893]: E0314 07:14:26.785259 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="extract-utilities" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.785331 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="extract-utilities" Mar 14 07:14:26 crc kubenswrapper[4893]: E0314 07:14:26.785403 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="extract-content" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.785755 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="extract-content" Mar 14 07:14:26 crc kubenswrapper[4893]: E0314 07:14:26.785844 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="registry-server" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.785919 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="registry-server" Mar 14 07:14:26 crc kubenswrapper[4893]: E0314 07:14:26.786000 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="registry-server" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.786072 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="registry-server" Mar 14 07:14:26 crc 
kubenswrapper[4893]: I0314 07:14:26.786508 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="686f244c-6569-41e7-a316-de89833dad05" containerName="registry-server" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.786650 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="76288b33-f834-43f4-ba1a-7be4a9ec82f0" containerName="registry-server" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.788073 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.788258 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.984853 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.985237 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:26 crc kubenswrapper[4893]: I0314 07:14:26.985293 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgw2\" (UniqueName: \"kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" 
Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.086145 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.086231 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.086278 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgw2\" (UniqueName: \"kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.086851 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.087321 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: 
I0314 07:14:27.111422 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgw2\" (UniqueName: \"kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2\") pod \"community-operators-s5ngg\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.408511 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.626907 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:27 crc kubenswrapper[4893]: I0314 07:14:27.781828 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerStarted","Data":"793a78de6de0d1a1d6beed8d39a9cd891fbaebeb5b59f16c0ebbb8999fde2efe"} Mar 14 07:14:28 crc kubenswrapper[4893]: I0314 07:14:28.789333 4893 generic.go:334] "Generic (PLEG): container finished" podID="83b7da54-102b-455a-89db-f760fa2085ef" containerID="156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203" exitCode=0 Mar 14 07:14:28 crc kubenswrapper[4893]: I0314 07:14:28.789415 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerDied","Data":"156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203"} Mar 14 07:14:29 crc kubenswrapper[4893]: I0314 07:14:29.796726 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerStarted","Data":"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870"} Mar 14 07:14:30 crc kubenswrapper[4893]: I0314 
07:14:30.808791 4893 generic.go:334] "Generic (PLEG): container finished" podID="83b7da54-102b-455a-89db-f760fa2085ef" containerID="2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870" exitCode=0 Mar 14 07:14:30 crc kubenswrapper[4893]: I0314 07:14:30.809410 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerDied","Data":"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870"} Mar 14 07:14:31 crc kubenswrapper[4893]: I0314 07:14:31.819133 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerStarted","Data":"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172"} Mar 14 07:14:31 crc kubenswrapper[4893]: I0314 07:14:31.845625 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s5ngg" podStartSLOduration=3.405850353 podStartE2EDuration="5.845552211s" podCreationTimestamp="2026-03-14 07:14:26 +0000 UTC" firstStartedPulling="2026-03-14 07:14:28.793224572 +0000 UTC m=+948.055401374" lastFinishedPulling="2026-03-14 07:14:31.23292642 +0000 UTC m=+950.495103232" observedRunningTime="2026-03-14 07:14:31.842868116 +0000 UTC m=+951.105044928" watchObservedRunningTime="2026-03-14 07:14:31.845552211 +0000 UTC m=+951.107729013" Mar 14 07:14:35 crc kubenswrapper[4893]: I0314 07:14:35.759139 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-psm2j" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" containerID="cri-o://cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11" gracePeriod=15 Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.245826 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-psm2j_ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9/console/0.log" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.246175 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347460 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347537 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rg6g\" (UniqueName: \"kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347573 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347597 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347630 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347711 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.347740 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert\") pod \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\" (UID: \"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9\") " Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.348421 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.348447 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.348695 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.349074 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config" (OuterVolumeSpecName: "console-config") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.353264 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g" (OuterVolumeSpecName: "kube-api-access-4rg6g") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "kube-api-access-4rg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.353273 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.356245 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" (UID: "ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449496 4893 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449578 4893 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449600 4893 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449618 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rg6g\" (UniqueName: \"kubernetes.io/projected/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-kube-api-access-4rg6g\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449636 4893 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449652 4893 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.449669 4893 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.852984 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-psm2j_ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9/console/0.log" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.853295 4893 generic.go:334] "Generic (PLEG): container finished" podID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerID="cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11" exitCode=2 Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.853330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-psm2j" event={"ID":"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9","Type":"ContainerDied","Data":"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11"} Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.853361 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-psm2j" event={"ID":"ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9","Type":"ContainerDied","Data":"864651349f8df1f5d4599ace23fddee6f61035514ad0b0ba08ca8d7eb8b4b65c"} Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.853381 4893 scope.go:117] "RemoveContainer" containerID="cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.853400 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-psm2j" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.877856 4893 scope.go:117] "RemoveContainer" containerID="cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11" Mar 14 07:14:36 crc kubenswrapper[4893]: E0314 07:14:36.878375 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11\": container with ID starting with cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11 not found: ID does not exist" containerID="cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.878415 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11"} err="failed to get container status \"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11\": rpc error: code = NotFound desc = could not find container \"cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11\": container with ID starting with cb290f12124e3f2357962cddd4eb5a16f8241a5b4d7405a119ac670484aa3c11 not found: ID does not exist" Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.893252 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-psm2j"] Mar 14 07:14:36 crc kubenswrapper[4893]: I0314 07:14:36.897265 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-psm2j"] Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.387458 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" path="/var/lib/kubelet/pods/ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9/volumes" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.409426 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.410139 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.456946 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.560612 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj"] Mar 14 07:14:37 crc kubenswrapper[4893]: E0314 07:14:37.561282 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.561435 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.561830 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3cbd4f-22c0-45f3-8f49-ac687b5d7ab9" containerName="console" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.563319 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.566605 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.566768 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj"] Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.666051 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zkv\" (UniqueName: \"kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.666199 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.666227 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: 
I0314 07:14:37.768015 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zkv\" (UniqueName: \"kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.768116 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.768143 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.768739 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.768810 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.786061 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zkv\" (UniqueName: \"kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.884853 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:37 crc kubenswrapper[4893]: I0314 07:14:37.912577 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:38 crc kubenswrapper[4893]: I0314 07:14:38.112793 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj"] Mar 14 07:14:38 crc kubenswrapper[4893]: I0314 07:14:38.868119 4893 generic.go:334] "Generic (PLEG): container finished" podID="901f8898-8f02-4d53-9795-99e707b401c6" containerID="e0008b8bf3d5645a2d200ee376247a3db2512541169fa0f5ed0d762fd8d1ecfa" exitCode=0 Mar 14 07:14:38 crc kubenswrapper[4893]: I0314 07:14:38.868291 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" 
event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerDied","Data":"e0008b8bf3d5645a2d200ee376247a3db2512541169fa0f5ed0d762fd8d1ecfa"} Mar 14 07:14:38 crc kubenswrapper[4893]: I0314 07:14:38.868461 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerStarted","Data":"54e58c84266e286231c05725d4234304125d4c4430eeccb5e080b0351d58fac1"} Mar 14 07:14:40 crc kubenswrapper[4893]: I0314 07:14:40.887180 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerStarted","Data":"3ce1f793f240dd720d89f26cd3a23c37e142e0c739f00e1f4a4442fb1650806e"} Mar 14 07:14:40 crc kubenswrapper[4893]: I0314 07:14:40.923444 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:40 crc kubenswrapper[4893]: I0314 07:14:40.923838 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s5ngg" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="registry-server" containerID="cri-o://23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172" gracePeriod=2 Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.828361 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.898705 4893 generic.go:334] "Generic (PLEG): container finished" podID="901f8898-8f02-4d53-9795-99e707b401c6" containerID="3ce1f793f240dd720d89f26cd3a23c37e142e0c739f00e1f4a4442fb1650806e" exitCode=0 Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.898841 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerDied","Data":"3ce1f793f240dd720d89f26cd3a23c37e142e0c739f00e1f4a4442fb1650806e"} Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.906043 4893 generic.go:334] "Generic (PLEG): container finished" podID="83b7da54-102b-455a-89db-f760fa2085ef" containerID="23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172" exitCode=0 Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.906144 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerDied","Data":"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172"} Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.906175 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5ngg" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.906232 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5ngg" event={"ID":"83b7da54-102b-455a-89db-f760fa2085ef","Type":"ContainerDied","Data":"793a78de6de0d1a1d6beed8d39a9cd891fbaebeb5b59f16c0ebbb8999fde2efe"} Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.906265 4893 scope.go:117] "RemoveContainer" containerID="23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.924205 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrgw2\" (UniqueName: \"kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2\") pod \"83b7da54-102b-455a-89db-f760fa2085ef\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.924589 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content\") pod \"83b7da54-102b-455a-89db-f760fa2085ef\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.924653 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities\") pod \"83b7da54-102b-455a-89db-f760fa2085ef\" (UID: \"83b7da54-102b-455a-89db-f760fa2085ef\") " Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.925618 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities" (OuterVolumeSpecName: "utilities") pod "83b7da54-102b-455a-89db-f760fa2085ef" (UID: "83b7da54-102b-455a-89db-f760fa2085ef"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.929614 4893 scope.go:117] "RemoveContainer" containerID="2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.932315 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2" (OuterVolumeSpecName: "kube-api-access-jrgw2") pod "83b7da54-102b-455a-89db-f760fa2085ef" (UID: "83b7da54-102b-455a-89db-f760fa2085ef"). InnerVolumeSpecName "kube-api-access-jrgw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.971356 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83b7da54-102b-455a-89db-f760fa2085ef" (UID: "83b7da54-102b-455a-89db-f760fa2085ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:41 crc kubenswrapper[4893]: I0314 07:14:41.988804 4893 scope.go:117] "RemoveContainer" containerID="156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.006094 4893 scope.go:117] "RemoveContainer" containerID="23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172" Mar 14 07:14:42 crc kubenswrapper[4893]: E0314 07:14:42.006755 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172\": container with ID starting with 23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172 not found: ID does not exist" containerID="23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.006869 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172"} err="failed to get container status \"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172\": rpc error: code = NotFound desc = could not find container \"23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172\": container with ID starting with 23c7bbe8dff0c2aa90d40a71a801fb9be93b59ecd2c1b1a56b1b8d895c817172 not found: ID does not exist" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.006897 4893 scope.go:117] "RemoveContainer" containerID="2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870" Mar 14 07:14:42 crc kubenswrapper[4893]: E0314 07:14:42.007414 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870\": container with ID starting with 
2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870 not found: ID does not exist" containerID="2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.007455 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870"} err="failed to get container status \"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870\": rpc error: code = NotFound desc = could not find container \"2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870\": container with ID starting with 2e5222605f0c83b1f7ac81856f5a78e70a98891f2f7086cbc77eca719b47c870 not found: ID does not exist" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.007485 4893 scope.go:117] "RemoveContainer" containerID="156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203" Mar 14 07:14:42 crc kubenswrapper[4893]: E0314 07:14:42.007778 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203\": container with ID starting with 156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203 not found: ID does not exist" containerID="156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.007814 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203"} err="failed to get container status \"156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203\": rpc error: code = NotFound desc = could not find container \"156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203\": container with ID starting with 156c4c265f98f30f1f0de724b2c6233bf7dcdd6f655a4a8142347b164ea76203 not found: ID does not 
exist" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.027712 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.027747 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrgw2\" (UniqueName: \"kubernetes.io/projected/83b7da54-102b-455a-89db-f760fa2085ef-kube-api-access-jrgw2\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.027764 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b7da54-102b-455a-89db-f760fa2085ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.263908 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.268384 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s5ngg"] Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.915313 4893 generic.go:334] "Generic (PLEG): container finished" podID="901f8898-8f02-4d53-9795-99e707b401c6" containerID="7c00d920fd45fc176389bfc8002b407b70b11615925820f1801797df198fe528" exitCode=0 Mar 14 07:14:42 crc kubenswrapper[4893]: I0314 07:14:42.915447 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerDied","Data":"7c00d920fd45fc176389bfc8002b407b70b11615925820f1801797df198fe528"} Mar 14 07:14:43 crc kubenswrapper[4893]: I0314 07:14:43.388750 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83b7da54-102b-455a-89db-f760fa2085ef" 
path="/var/lib/kubelet/pods/83b7da54-102b-455a-89db-f760fa2085ef/volumes" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.238490 4893 scope.go:117] "RemoveContainer" containerID="ce002a6786167410da0b822e7b73868b4f7987f4ff00bb4e971102daae2a214a" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.251794 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.356969 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle\") pod \"901f8898-8f02-4d53-9795-99e707b401c6\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.357014 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util\") pod \"901f8898-8f02-4d53-9795-99e707b401c6\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.357128 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4zkv\" (UniqueName: \"kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv\") pod \"901f8898-8f02-4d53-9795-99e707b401c6\" (UID: \"901f8898-8f02-4d53-9795-99e707b401c6\") " Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.358778 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle" (OuterVolumeSpecName: "bundle") pod "901f8898-8f02-4d53-9795-99e707b401c6" (UID: "901f8898-8f02-4d53-9795-99e707b401c6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.361514 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv" (OuterVolumeSpecName: "kube-api-access-z4zkv") pod "901f8898-8f02-4d53-9795-99e707b401c6" (UID: "901f8898-8f02-4d53-9795-99e707b401c6"). InnerVolumeSpecName "kube-api-access-z4zkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.371489 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util" (OuterVolumeSpecName: "util") pod "901f8898-8f02-4d53-9795-99e707b401c6" (UID: "901f8898-8f02-4d53-9795-99e707b401c6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.459181 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4zkv\" (UniqueName: \"kubernetes.io/projected/901f8898-8f02-4d53-9795-99e707b401c6-kube-api-access-z4zkv\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.459244 4893 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.459267 4893 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/901f8898-8f02-4d53-9795-99e707b401c6-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.938027 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" 
event={"ID":"901f8898-8f02-4d53-9795-99e707b401c6","Type":"ContainerDied","Data":"54e58c84266e286231c05725d4234304125d4c4430eeccb5e080b0351d58fac1"} Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.938081 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e58c84266e286231c05725d4234304125d4c4430eeccb5e080b0351d58fac1" Mar 14 07:14:44 crc kubenswrapper[4893]: I0314 07:14:44.938095 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.906920 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw"] Mar 14 07:14:53 crc kubenswrapper[4893]: E0314 07:14:53.908637 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="extract-content" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.908745 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="extract-content" Mar 14 07:14:53 crc kubenswrapper[4893]: E0314 07:14:53.908840 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="extract" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.908916 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="extract" Mar 14 07:14:53 crc kubenswrapper[4893]: E0314 07:14:53.908978 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="registry-server" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909027 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="registry-server" Mar 14 07:14:53 crc 
kubenswrapper[4893]: E0314 07:14:53.909074 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="util" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909140 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="util" Mar 14 07:14:53 crc kubenswrapper[4893]: E0314 07:14:53.909203 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="extract-utilities" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909273 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="extract-utilities" Mar 14 07:14:53 crc kubenswrapper[4893]: E0314 07:14:53.909364 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="pull" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909434 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="pull" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909608 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="901f8898-8f02-4d53-9795-99e707b401c6" containerName="extract" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.909678 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="83b7da54-102b-455a-89db-f760fa2085ef" containerName="registry-server" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.910162 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.912596 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.912597 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.912680 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.913267 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.914698 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6m6d6" Mar 14 07:14:53 crc kubenswrapper[4893]: I0314 07:14:53.927119 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw"] Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.083927 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx492\" (UniqueName: \"kubernetes.io/projected/02873c8a-a664-48db-bc40-5c9862429b0f-kube-api-access-dx492\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.084031 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.084068 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-webhook-cert\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.185641 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx492\" (UniqueName: \"kubernetes.io/projected/02873c8a-a664-48db-bc40-5c9862429b0f-kube-api-access-dx492\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.185800 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-apiservice-cert\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.185830 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-webhook-cert\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: 
I0314 07:14:54.192796 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-apiservice-cert\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.192831 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02873c8a-a664-48db-bc40-5c9862429b0f-webhook-cert\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.208211 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx492\" (UniqueName: \"kubernetes.io/projected/02873c8a-a664-48db-bc40-5c9862429b0f-kube-api-access-dx492\") pod \"metallb-operator-controller-manager-7447f7bf6-2crtw\" (UID: \"02873c8a-a664-48db-bc40-5c9862429b0f\") " pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.228297 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.252852 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn"] Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.253552 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.255765 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.255822 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.268425 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-47h7z" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.273003 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn"] Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.289595 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tgv\" (UniqueName: \"kubernetes.io/projected/fe78adb3-52b5-406b-b4e9-64111677415f-kube-api-access-88tgv\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.289736 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.289791 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-webhook-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.392119 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88tgv\" (UniqueName: \"kubernetes.io/projected/fe78adb3-52b5-406b-b4e9-64111677415f-kube-api-access-88tgv\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.392616 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.392664 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-webhook-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.413899 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-webhook-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc 
kubenswrapper[4893]: I0314 07:14:54.423071 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tgv\" (UniqueName: \"kubernetes.io/projected/fe78adb3-52b5-406b-b4e9-64111677415f-kube-api-access-88tgv\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.426825 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe78adb3-52b5-406b-b4e9-64111677415f-apiservice-cert\") pod \"metallb-operator-webhook-server-7f5cb5468c-x85cn\" (UID: \"fe78adb3-52b5-406b-b4e9-64111677415f\") " pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.543843 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw"] Mar 14 07:14:54 crc kubenswrapper[4893]: W0314 07:14:54.552115 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02873c8a_a664_48db_bc40_5c9862429b0f.slice/crio-2877e16fb4448eb5f5e8109f37bd97559b7473c9cad15ddb1020499eab5cb26c WatchSource:0}: Error finding container 2877e16fb4448eb5f5e8109f37bd97559b7473c9cad15ddb1020499eab5cb26c: Status 404 returned error can't find the container with id 2877e16fb4448eb5f5e8109f37bd97559b7473c9cad15ddb1020499eab5cb26c Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.604433 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.788118 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn"] Mar 14 07:14:54 crc kubenswrapper[4893]: W0314 07:14:54.799472 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe78adb3_52b5_406b_b4e9_64111677415f.slice/crio-ac3d06963960e99d6e654ddb51e2e05f0bf5d174a22e11b6f1260f61b4243b8e WatchSource:0}: Error finding container ac3d06963960e99d6e654ddb51e2e05f0bf5d174a22e11b6f1260f61b4243b8e: Status 404 returned error can't find the container with id ac3d06963960e99d6e654ddb51e2e05f0bf5d174a22e11b6f1260f61b4243b8e Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.994506 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" event={"ID":"fe78adb3-52b5-406b-b4e9-64111677415f","Type":"ContainerStarted","Data":"ac3d06963960e99d6e654ddb51e2e05f0bf5d174a22e11b6f1260f61b4243b8e"} Mar 14 07:14:54 crc kubenswrapper[4893]: I0314 07:14:54.995642 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" event={"ID":"02873c8a-a664-48db-bc40-5c9862429b0f","Type":"ContainerStarted","Data":"2877e16fb4448eb5f5e8109f37bd97559b7473c9cad15ddb1020499eab5cb26c"} Mar 14 07:14:58 crc kubenswrapper[4893]: I0314 07:14:58.020379 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" event={"ID":"02873c8a-a664-48db-bc40-5c9862429b0f","Type":"ContainerStarted","Data":"aa9480c5ed2c78bd0534bf8acbc1d84a1fe14fc96e6a0944d293937c1aaa555a"} Mar 14 07:14:58 crc kubenswrapper[4893]: I0314 07:14:58.021059 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:14:58 crc kubenswrapper[4893]: I0314 07:14:58.038566 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" podStartSLOduration=2.042256294 podStartE2EDuration="5.038551941s" podCreationTimestamp="2026-03-14 07:14:53 +0000 UTC" firstStartedPulling="2026-03-14 07:14:54.554568873 +0000 UTC m=+973.816745665" lastFinishedPulling="2026-03-14 07:14:57.55086452 +0000 UTC m=+976.813041312" observedRunningTime="2026-03-14 07:14:58.036776868 +0000 UTC m=+977.298953660" watchObservedRunningTime="2026-03-14 07:14:58.038551941 +0000 UTC m=+977.300728733" Mar 14 07:14:59 crc kubenswrapper[4893]: I0314 07:14:59.732158 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:14:59 crc kubenswrapper[4893]: I0314 07:14:59.732574 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.047288 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" event={"ID":"fe78adb3-52b5-406b-b4e9-64111677415f","Type":"ContainerStarted","Data":"85895af595c6941cec908a50d44f4264ec4e375ea7550e9c130c27b850b945f4"} Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.047449 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.070343 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" podStartSLOduration=1.6723982240000002 podStartE2EDuration="6.070323381s" podCreationTimestamp="2026-03-14 07:14:54 +0000 UTC" firstStartedPulling="2026-03-14 07:14:54.801758535 +0000 UTC m=+974.063935327" lastFinishedPulling="2026-03-14 07:14:59.199683692 +0000 UTC m=+978.461860484" observedRunningTime="2026-03-14 07:15:00.067814309 +0000 UTC m=+979.329991111" watchObservedRunningTime="2026-03-14 07:15:00.070323381 +0000 UTC m=+979.332500193" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.127141 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd"] Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.127924 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.130503 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.136832 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.138290 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd"] Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.294273 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.294337 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbr9\" (UniqueName: \"kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.294413 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.395344 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.395438 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.395476 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbr9\" (UniqueName: \"kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.397593 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.413255 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.418162 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbr9\" (UniqueName: \"kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9\") pod \"collect-profiles-29557875-qf9nd\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.449167 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:00 crc kubenswrapper[4893]: I0314 07:15:00.857601 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd"] Mar 14 07:15:01 crc kubenswrapper[4893]: I0314 07:15:01.053473 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" event={"ID":"a27ae91a-070d-493c-a949-3ae6ffe8d411","Type":"ContainerStarted","Data":"edbf875ce088881f17ee94c2d60bb6b28c0e3260767d018d787baece34481800"} Mar 14 07:15:01 crc kubenswrapper[4893]: I0314 07:15:01.053533 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" event={"ID":"a27ae91a-070d-493c-a949-3ae6ffe8d411","Type":"ContainerStarted","Data":"a5def67af36765498ac290661e4060847827ef466334fa4432a8ca081c1ba8e9"} Mar 14 07:15:01 crc kubenswrapper[4893]: I0314 07:15:01.075029 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" 
podStartSLOduration=1.074197532 podStartE2EDuration="1.074197532s" podCreationTimestamp="2026-03-14 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:15:01.06675282 +0000 UTC m=+980.328929612" watchObservedRunningTime="2026-03-14 07:15:01.074197532 +0000 UTC m=+980.336374324" Mar 14 07:15:02 crc kubenswrapper[4893]: I0314 07:15:02.063757 4893 generic.go:334] "Generic (PLEG): container finished" podID="a27ae91a-070d-493c-a949-3ae6ffe8d411" containerID="edbf875ce088881f17ee94c2d60bb6b28c0e3260767d018d787baece34481800" exitCode=0 Mar 14 07:15:02 crc kubenswrapper[4893]: I0314 07:15:02.063837 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" event={"ID":"a27ae91a-070d-493c-a949-3ae6ffe8d411","Type":"ContainerDied","Data":"edbf875ce088881f17ee94c2d60bb6b28c0e3260767d018d787baece34481800"} Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.294512 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.440025 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume\") pod \"a27ae91a-070d-493c-a949-3ae6ffe8d411\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.440162 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume\") pod \"a27ae91a-070d-493c-a949-3ae6ffe8d411\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.440249 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlbr9\" (UniqueName: \"kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9\") pod \"a27ae91a-070d-493c-a949-3ae6ffe8d411\" (UID: \"a27ae91a-070d-493c-a949-3ae6ffe8d411\") " Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.441126 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume" (OuterVolumeSpecName: "config-volume") pod "a27ae91a-070d-493c-a949-3ae6ffe8d411" (UID: "a27ae91a-070d-493c-a949-3ae6ffe8d411"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.446877 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9" (OuterVolumeSpecName: "kube-api-access-zlbr9") pod "a27ae91a-070d-493c-a949-3ae6ffe8d411" (UID: "a27ae91a-070d-493c-a949-3ae6ffe8d411"). 
InnerVolumeSpecName "kube-api-access-zlbr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.447072 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a27ae91a-070d-493c-a949-3ae6ffe8d411" (UID: "a27ae91a-070d-493c-a949-3ae6ffe8d411"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.542189 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a27ae91a-070d-493c-a949-3ae6ffe8d411-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.542225 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a27ae91a-070d-493c-a949-3ae6ffe8d411-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:03 crc kubenswrapper[4893]: I0314 07:15:03.542240 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlbr9\" (UniqueName: \"kubernetes.io/projected/a27ae91a-070d-493c-a949-3ae6ffe8d411-kube-api-access-zlbr9\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:04 crc kubenswrapper[4893]: I0314 07:15:04.074760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" event={"ID":"a27ae91a-070d-493c-a949-3ae6ffe8d411","Type":"ContainerDied","Data":"a5def67af36765498ac290661e4060847827ef466334fa4432a8ca081c1ba8e9"} Mar 14 07:15:04 crc kubenswrapper[4893]: I0314 07:15:04.074814 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5def67af36765498ac290661e4060847827ef466334fa4432a8ca081c1ba8e9" Mar 14 07:15:04 crc kubenswrapper[4893]: I0314 07:15:04.074854 4893 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd" Mar 14 07:15:14 crc kubenswrapper[4893]: I0314 07:15:14.610678 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f5cb5468c-x85cn" Mar 14 07:15:29 crc kubenswrapper[4893]: I0314 07:15:29.731614 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:15:29 crc kubenswrapper[4893]: I0314 07:15:29.732226 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.231287 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7447f7bf6-2crtw" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.892057 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7m6db"] Mar 14 07:15:34 crc kubenswrapper[4893]: E0314 07:15:34.892793 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27ae91a-070d-493c-a949-3ae6ffe8d411" containerName="collect-profiles" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.892950 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27ae91a-070d-493c-a949-3ae6ffe8d411" containerName="collect-profiles" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.893236 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a27ae91a-070d-493c-a949-3ae6ffe8d411" containerName="collect-profiles" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.896597 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.897886 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs"] Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.898541 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.899516 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-j6qwd" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.899869 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.900081 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.902710 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.912306 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs"] Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.990267 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2pzp9"] Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.991202 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2pzp9" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.992781 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.993049 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.993603 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 14 07:15:34 crc kubenswrapper[4893]: I0314 07:15:34.995200 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-47f84" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.005293 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-clzml"] Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.006381 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.015544 4893 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.024964 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-clzml"] Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034556 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034624 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics-certs\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034662 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-startup\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034681 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-reloader\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 
07:15:35.034700 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-conf\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034721 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5sp\" (UniqueName: \"kubernetes.io/projected/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-kube-api-access-lm5sp\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034774 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl6q8\" (UniqueName: \"kubernetes.io/projected/17f61fcc-48c5-42ef-97a4-9440159a79fc-kube-api-access-vl6q8\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034803 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-sockets\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.034821 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.136080 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.136467 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-cert\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.136730 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqmgl\" (UniqueName: \"kubernetes.io/projected/92e0ed09-c3c1-42a9-9405-fcf906686b43-kube-api-access-nqmgl\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.136965 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-metrics-certs\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.137193 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.137376 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.137577 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics-certs\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.137755 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-startup\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.137928 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-reloader\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138088 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlh54\" (UniqueName: \"kubernetes.io/projected/25c5b449-1fb7-4fa3-80da-3227b24100d5-kube-api-access-mlh54\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138268 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-conf\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138444 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm5sp\" (UniqueName: \"kubernetes.io/projected/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-kube-api-access-lm5sp\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138672 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl6q8\" (UniqueName: \"kubernetes.io/projected/17f61fcc-48c5-42ef-97a4-9440159a79fc-kube-api-access-vl6q8\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138890 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/25c5b449-1fb7-4fa3-80da-3227b24100d5-metallb-excludel2\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.139071 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-sockets\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.139226 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics\") pod \"frr-k8s-7m6db\" 
(UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.139345 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-startup\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138509 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-reloader\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.138723 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-conf\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.140145 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.140689 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17f61fcc-48c5-42ef-97a4-9440159a79fc-frr-sockets\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.141977 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/17f61fcc-48c5-42ef-97a4-9440159a79fc-metrics-certs\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.142135 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.159946 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl6q8\" (UniqueName: \"kubernetes.io/projected/17f61fcc-48c5-42ef-97a4-9440159a79fc-kube-api-access-vl6q8\") pod \"frr-k8s-7m6db\" (UID: \"17f61fcc-48c5-42ef-97a4-9440159a79fc\") " pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.160453 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm5sp\" (UniqueName: \"kubernetes.io/projected/c697e0ec-f27c-4fff-9e27-bb5ba9738ad5-kube-api-access-lm5sp\") pod \"frr-k8s-webhook-server-bcc4b6f68-s55qs\" (UID: \"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.214022 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.223383 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.247193 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-cert\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.247749 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmgl\" (UniqueName: \"kubernetes.io/projected/92e0ed09-c3c1-42a9-9405-fcf906686b43-kube-api-access-nqmgl\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.247832 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-metrics-certs\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.247912 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.247991 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: 
I0314 07:15:35.248076 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlh54\" (UniqueName: \"kubernetes.io/projected/25c5b449-1fb7-4fa3-80da-3227b24100d5-kube-api-access-mlh54\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.248162 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/25c5b449-1fb7-4fa3-80da-3227b24100d5-metallb-excludel2\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.248908 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/25c5b449-1fb7-4fa3-80da-3227b24100d5-metallb-excludel2\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.249475 4893 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.249608 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs podName:92e0ed09-c3c1-42a9-9405-fcf906686b43 nodeName:}" failed. No retries permitted until 2026-03-14 07:15:35.749589928 +0000 UTC m=+1015.011766720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs") pod "controller-7bb4cc7c98-clzml" (UID: "92e0ed09-c3c1-42a9-9405-fcf906686b43") : secret "controller-certs-secret" not found Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.249966 4893 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.250067 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist podName:25c5b449-1fb7-4fa3-80da-3227b24100d5 nodeName:}" failed. No retries permitted until 2026-03-14 07:15:35.750055129 +0000 UTC m=+1015.012231921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist") pod "speaker-2pzp9" (UID: "25c5b449-1fb7-4fa3-80da-3227b24100d5") : secret "metallb-memberlist" not found Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.251842 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-metrics-certs\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.252374 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-cert\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.267090 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqmgl\" (UniqueName: 
\"kubernetes.io/projected/92e0ed09-c3c1-42a9-9405-fcf906686b43-kube-api-access-nqmgl\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.271708 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlh54\" (UniqueName: \"kubernetes.io/projected/25c5b449-1fb7-4fa3-80da-3227b24100d5-kube-api-access-mlh54\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.486586 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs"] Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.754742 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.755468 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.755825 4893 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 07:15:35 crc kubenswrapper[4893]: E0314 07:15:35.755992 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist podName:25c5b449-1fb7-4fa3-80da-3227b24100d5 nodeName:}" failed. 
No retries permitted until 2026-03-14 07:15:36.755937885 +0000 UTC m=+1016.018114717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist") pod "speaker-2pzp9" (UID: "25c5b449-1fb7-4fa3-80da-3227b24100d5") : secret "metallb-memberlist" not found Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.763588 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92e0ed09-c3c1-42a9-9405-fcf906686b43-metrics-certs\") pod \"controller-7bb4cc7c98-clzml\" (UID: \"92e0ed09-c3c1-42a9-9405-fcf906686b43\") " pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:35 crc kubenswrapper[4893]: I0314 07:15:35.918367 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.196482 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-clzml"] Mar 14 07:15:36 crc kubenswrapper[4893]: W0314 07:15:36.197490 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e0ed09_c3c1_42a9_9405_fcf906686b43.slice/crio-089cbbe7d09967a98e0ee02569d7689ba6cada87c1132db96b35fdb17306330f WatchSource:0}: Error finding container 089cbbe7d09967a98e0ee02569d7689ba6cada87c1132db96b35fdb17306330f: Status 404 returned error can't find the container with id 089cbbe7d09967a98e0ee02569d7689ba6cada87c1132db96b35fdb17306330f Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.279802 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-clzml" event={"ID":"92e0ed09-c3c1-42a9-9405-fcf906686b43","Type":"ContainerStarted","Data":"089cbbe7d09967a98e0ee02569d7689ba6cada87c1132db96b35fdb17306330f"} Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 
07:15:36.281037 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"39f4306aadf8498d80d4654568deedd606fdf83f8dc30394960bee808a8ad17c"} Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.281875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" event={"ID":"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5","Type":"ContainerStarted","Data":"efaeef8949da52e49e91933c9c1e52faf5baa664176f8364dac83bc5d0b688ee"} Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.774447 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.782412 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/25c5b449-1fb7-4fa3-80da-3227b24100d5-memberlist\") pod \"speaker-2pzp9\" (UID: \"25c5b449-1fb7-4fa3-80da-3227b24100d5\") " pod="metallb-system/speaker-2pzp9" Mar 14 07:15:36 crc kubenswrapper[4893]: I0314 07:15:36.804824 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2pzp9" Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.308331 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pzp9" event={"ID":"25c5b449-1fb7-4fa3-80da-3227b24100d5","Type":"ContainerStarted","Data":"575f41d704de672724d7adf1dd7471d2839dc88d11e98d9e200a1c5f0f1a5d10"} Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.310481 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pzp9" event={"ID":"25c5b449-1fb7-4fa3-80da-3227b24100d5","Type":"ContainerStarted","Data":"49b882bf4c88b108f01e840d3b7beccfe4b35bb461d945d98d03951d775bb567"} Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.318761 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-clzml" event={"ID":"92e0ed09-c3c1-42a9-9405-fcf906686b43","Type":"ContainerStarted","Data":"989ce7e3406caf2901e3734948fc6f83229f5fe5f0600f10cd9f83d9b0789af1"} Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.318802 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-clzml" event={"ID":"92e0ed09-c3c1-42a9-9405-fcf906686b43","Type":"ContainerStarted","Data":"1137e3320d2d9bd232fa169014a080dc1d5fa4942bd401f2d43deaa0a3c25329"} Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.319656 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:37 crc kubenswrapper[4893]: I0314 07:15:37.341775 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-clzml" podStartSLOduration=3.341760296 podStartE2EDuration="3.341760296s" podCreationTimestamp="2026-03-14 07:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:15:37.339340857 +0000 UTC m=+1016.601517659" 
watchObservedRunningTime="2026-03-14 07:15:37.341760296 +0000 UTC m=+1016.603937078" Mar 14 07:15:38 crc kubenswrapper[4893]: I0314 07:15:38.326672 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pzp9" event={"ID":"25c5b449-1fb7-4fa3-80da-3227b24100d5","Type":"ContainerStarted","Data":"c1b089ee365d53b0551b5efd1fdb6348f7b1621f3661d4d9fb7724a46cdd2c10"} Mar 14 07:15:38 crc kubenswrapper[4893]: I0314 07:15:38.326718 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2pzp9" Mar 14 07:15:38 crc kubenswrapper[4893]: I0314 07:15:38.346032 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2pzp9" podStartSLOduration=4.346015616 podStartE2EDuration="4.346015616s" podCreationTimestamp="2026-03-14 07:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:15:38.343377141 +0000 UTC m=+1017.605553943" watchObservedRunningTime="2026-03-14 07:15:38.346015616 +0000 UTC m=+1017.608192408" Mar 14 07:15:43 crc kubenswrapper[4893]: I0314 07:15:43.378222 4893 generic.go:334] "Generic (PLEG): container finished" podID="17f61fcc-48c5-42ef-97a4-9440159a79fc" containerID="3e9244fe22f155b4e51fd1700167438a114dd28a6aff1ff93d78dc24b4f69988" exitCode=0 Mar 14 07:15:43 crc kubenswrapper[4893]: I0314 07:15:43.389836 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerDied","Data":"3e9244fe22f155b4e51fd1700167438a114dd28a6aff1ff93d78dc24b4f69988"} Mar 14 07:15:43 crc kubenswrapper[4893]: I0314 07:15:43.389917 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" 
event={"ID":"c697e0ec-f27c-4fff-9e27-bb5ba9738ad5","Type":"ContainerStarted","Data":"3010c3d34b9b2e1d6002ab777803e267deb3d0447931856f0a6bd7a1dc6adaeb"} Mar 14 07:15:43 crc kubenswrapper[4893]: I0314 07:15:43.389972 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:43 crc kubenswrapper[4893]: I0314 07:15:43.448984 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" podStartSLOduration=2.278669496 podStartE2EDuration="9.448956307s" podCreationTimestamp="2026-03-14 07:15:34 +0000 UTC" firstStartedPulling="2026-03-14 07:15:35.496376 +0000 UTC m=+1014.758552792" lastFinishedPulling="2026-03-14 07:15:42.666662791 +0000 UTC m=+1021.928839603" observedRunningTime="2026-03-14 07:15:43.440012907 +0000 UTC m=+1022.702189719" watchObservedRunningTime="2026-03-14 07:15:43.448956307 +0000 UTC m=+1022.711133139" Mar 14 07:15:44 crc kubenswrapper[4893]: I0314 07:15:44.394636 4893 generic.go:334] "Generic (PLEG): container finished" podID="17f61fcc-48c5-42ef-97a4-9440159a79fc" containerID="fc6690566fac1500104b4bd868f3feb5d997c01e4092527dab34b2ecccfa62e5" exitCode=0 Mar 14 07:15:44 crc kubenswrapper[4893]: I0314 07:15:44.394728 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerDied","Data":"fc6690566fac1500104b4bd868f3feb5d997c01e4092527dab34b2ecccfa62e5"} Mar 14 07:15:45 crc kubenswrapper[4893]: I0314 07:15:45.405486 4893 generic.go:334] "Generic (PLEG): container finished" podID="17f61fcc-48c5-42ef-97a4-9440159a79fc" containerID="8ca264207f37b347b7bc8ffdc11d733f82f5414f3a58eac58ef84119459cccd0" exitCode=0 Mar 14 07:15:45 crc kubenswrapper[4893]: I0314 07:15:45.405604 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" 
event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerDied","Data":"8ca264207f37b347b7bc8ffdc11d733f82f5414f3a58eac58ef84119459cccd0"} Mar 14 07:15:46 crc kubenswrapper[4893]: I0314 07:15:46.419632 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"241c9f12ea0ea77a26eb983c766103c2706181caa4d1c73fe59217992dd37e01"} Mar 14 07:15:46 crc kubenswrapper[4893]: I0314 07:15:46.419907 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"07309be32aa832e392ea8127b1ab1b5288d7b3ebf56941b6c7f95366dd2479bf"} Mar 14 07:15:46 crc kubenswrapper[4893]: I0314 07:15:46.419922 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"688bf37ec19eeb75df7785f14653b0797515d74a7e6c5a70594a44355807d1ac"} Mar 14 07:15:46 crc kubenswrapper[4893]: I0314 07:15:46.419933 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"baaf8a2101308472087abf7a99cf0302dde2963258875996abc02513b5568904"} Mar 14 07:15:46 crc kubenswrapper[4893]: I0314 07:15:46.419943 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"5246588a7330d3bc542917c95f0720125c67bb60df6bd8743b1fd26471b09875"} Mar 14 07:15:47 crc kubenswrapper[4893]: I0314 07:15:47.434715 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7m6db" event={"ID":"17f61fcc-48c5-42ef-97a4-9440159a79fc","Type":"ContainerStarted","Data":"933e3a9ec807b8c8e15278dbf6b29a2286ccf36d6a25e498e6739a831442725a"} Mar 14 07:15:47 crc 
kubenswrapper[4893]: I0314 07:15:47.435011 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:47 crc kubenswrapper[4893]: I0314 07:15:47.467808 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7m6db" podStartSLOduration=6.256752973 podStartE2EDuration="13.467784681s" podCreationTimestamp="2026-03-14 07:15:34 +0000 UTC" firstStartedPulling="2026-03-14 07:15:35.44735196 +0000 UTC m=+1014.709528762" lastFinishedPulling="2026-03-14 07:15:42.658383638 +0000 UTC m=+1021.920560470" observedRunningTime="2026-03-14 07:15:47.466563871 +0000 UTC m=+1026.728740723" watchObservedRunningTime="2026-03-14 07:15:47.467784681 +0000 UTC m=+1026.729961513" Mar 14 07:15:50 crc kubenswrapper[4893]: I0314 07:15:50.214875 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:50 crc kubenswrapper[4893]: I0314 07:15:50.278149 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:55 crc kubenswrapper[4893]: I0314 07:15:55.216929 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7m6db" Mar 14 07:15:55 crc kubenswrapper[4893]: I0314 07:15:55.230025 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s55qs" Mar 14 07:15:55 crc kubenswrapper[4893]: I0314 07:15:55.923195 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-clzml" Mar 14 07:15:56 crc kubenswrapper[4893]: I0314 07:15:56.808090 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2pzp9" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.220726 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n"] Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.222637 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.224609 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.238094 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n"] Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.287113 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.287196 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckd6\" (UniqueName: \"kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.287336 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.388607 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckd6\" (UniqueName: \"kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.388673 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.388758 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.389503 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.389690 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.415638 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckd6\" (UniqueName: \"kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.578082 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:15:58 crc kubenswrapper[4893]: I0314 07:15:58.840713 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n"] Mar 14 07:15:58 crc kubenswrapper[4893]: W0314 07:15:58.845208 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb3756e8_b82d_40c5_90c6_0cc4114980a6.slice/crio-555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2 WatchSource:0}: Error finding container 555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2: Status 404 returned error can't find the container with id 555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2 Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.514019 4893 generic.go:334] "Generic (PLEG): container finished" podID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerID="6a883c51d5daaee4c022e3908e59eb73864fae01cb173034440e69780e63ca29" exitCode=0 Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.514095 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" event={"ID":"fb3756e8-b82d-40c5-90c6-0cc4114980a6","Type":"ContainerDied","Data":"6a883c51d5daaee4c022e3908e59eb73864fae01cb173034440e69780e63ca29"} Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.514343 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" event={"ID":"fb3756e8-b82d-40c5-90c6-0cc4114980a6","Type":"ContainerStarted","Data":"555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2"} Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.731357 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.731412 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.731458 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.732071 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:15:59 crc kubenswrapper[4893]: I0314 07:15:59.732123 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0" gracePeriod=600 Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.137743 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557876-nn5pp"] Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.138919 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.141113 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.141202 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.141341 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.145129 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-nn5pp"] Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.311161 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgq7\" (UniqueName: \"kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7\") pod \"auto-csr-approver-29557876-nn5pp\" (UID: \"db8ca722-d281-4ac8-9da6-74dbd04787c4\") " pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.412490 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgq7\" (UniqueName: \"kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7\") pod \"auto-csr-approver-29557876-nn5pp\" (UID: \"db8ca722-d281-4ac8-9da6-74dbd04787c4\") " pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.430441 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgq7\" (UniqueName: \"kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7\") pod \"auto-csr-approver-29557876-nn5pp\" (UID: \"db8ca722-d281-4ac8-9da6-74dbd04787c4\") " 
pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.455711 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.523491 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0" exitCode=0 Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.523541 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0"} Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.523582 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f"} Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.523601 4893 scope.go:117] "RemoveContainer" containerID="41d06da2c2df1866ba771ef2b7559c95677812a35cd5740f54ac967f6074fe35" Mar 14 07:16:00 crc kubenswrapper[4893]: I0314 07:16:00.861959 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-nn5pp"] Mar 14 07:16:00 crc kubenswrapper[4893]: W0314 07:16:00.876436 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8ca722_d281_4ac8_9da6_74dbd04787c4.slice/crio-3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d WatchSource:0}: Error finding container 3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d: Status 404 returned error can't find the container 
with id 3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d Mar 14 07:16:01 crc kubenswrapper[4893]: I0314 07:16:01.530512 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" event={"ID":"db8ca722-d281-4ac8-9da6-74dbd04787c4","Type":"ContainerStarted","Data":"3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d"} Mar 14 07:16:03 crc kubenswrapper[4893]: I0314 07:16:03.546251 4893 generic.go:334] "Generic (PLEG): container finished" podID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerID="019a75dd8382c6282202195fb5faa7663e00f829836801a01a2ae607e0fafa63" exitCode=0 Mar 14 07:16:03 crc kubenswrapper[4893]: I0314 07:16:03.546326 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" event={"ID":"fb3756e8-b82d-40c5-90c6-0cc4114980a6","Type":"ContainerDied","Data":"019a75dd8382c6282202195fb5faa7663e00f829836801a01a2ae607e0fafa63"} Mar 14 07:16:03 crc kubenswrapper[4893]: I0314 07:16:03.548643 4893 generic.go:334] "Generic (PLEG): container finished" podID="db8ca722-d281-4ac8-9da6-74dbd04787c4" containerID="f79609637c7322108aa2e766ff58838b6e7ab55adc2ab5d441df7f60abf1fb4f" exitCode=0 Mar 14 07:16:03 crc kubenswrapper[4893]: I0314 07:16:03.548687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" event={"ID":"db8ca722-d281-4ac8-9da6-74dbd04787c4","Type":"ContainerDied","Data":"f79609637c7322108aa2e766ff58838b6e7ab55adc2ab5d441df7f60abf1fb4f"} Mar 14 07:16:04 crc kubenswrapper[4893]: I0314 07:16:04.558240 4893 generic.go:334] "Generic (PLEG): container finished" podID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerID="ff5cb05602cce3d2788d47477d1ab86d844363def65a601329976e4e5a5fc45f" exitCode=0 Mar 14 07:16:04 crc kubenswrapper[4893]: I0314 07:16:04.558361 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" event={"ID":"fb3756e8-b82d-40c5-90c6-0cc4114980a6","Type":"ContainerDied","Data":"ff5cb05602cce3d2788d47477d1ab86d844363def65a601329976e4e5a5fc45f"} Mar 14 07:16:04 crc kubenswrapper[4893]: I0314 07:16:04.822640 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:04 crc kubenswrapper[4893]: I0314 07:16:04.987175 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdgq7\" (UniqueName: \"kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7\") pod \"db8ca722-d281-4ac8-9da6-74dbd04787c4\" (UID: \"db8ca722-d281-4ac8-9da6-74dbd04787c4\") " Mar 14 07:16:04 crc kubenswrapper[4893]: I0314 07:16:04.993369 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7" (OuterVolumeSpecName: "kube-api-access-cdgq7") pod "db8ca722-d281-4ac8-9da6-74dbd04787c4" (UID: "db8ca722-d281-4ac8-9da6-74dbd04787c4"). InnerVolumeSpecName "kube-api-access-cdgq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.089023 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdgq7\" (UniqueName: \"kubernetes.io/projected/db8ca722-d281-4ac8-9da6-74dbd04787c4-kube-api-access-cdgq7\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.566987 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" event={"ID":"db8ca722-d281-4ac8-9da6-74dbd04787c4","Type":"ContainerDied","Data":"3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d"} Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.567034 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee3dc7ee8e7270433ee6a6cca29d152f1e18ecfa45c92fe1ef7ae66bd11354d" Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.567087 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-nn5pp" Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.884977 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.889184 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-gxmgb"] Mar 14 07:16:05 crc kubenswrapper[4893]: I0314 07:16:05.895466 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-gxmgb"] Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.005438 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ckd6\" (UniqueName: \"kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6\") pod \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.005676 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle\") pod \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.005711 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util\") pod \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\" (UID: \"fb3756e8-b82d-40c5-90c6-0cc4114980a6\") " Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.006443 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle" (OuterVolumeSpecName: "bundle") pod "fb3756e8-b82d-40c5-90c6-0cc4114980a6" (UID: "fb3756e8-b82d-40c5-90c6-0cc4114980a6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.010025 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6" (OuterVolumeSpecName: "kube-api-access-2ckd6") pod "fb3756e8-b82d-40c5-90c6-0cc4114980a6" (UID: "fb3756e8-b82d-40c5-90c6-0cc4114980a6"). InnerVolumeSpecName "kube-api-access-2ckd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.015054 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util" (OuterVolumeSpecName: "util") pod "fb3756e8-b82d-40c5-90c6-0cc4114980a6" (UID: "fb3756e8-b82d-40c5-90c6-0cc4114980a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.107360 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ckd6\" (UniqueName: \"kubernetes.io/projected/fb3756e8-b82d-40c5-90c6-0cc4114980a6-kube-api-access-2ckd6\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.107393 4893 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.107402 4893 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb3756e8-b82d-40c5-90c6-0cc4114980a6-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.577149 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" 
event={"ID":"fb3756e8-b82d-40c5-90c6-0cc4114980a6","Type":"ContainerDied","Data":"555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2"} Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.577193 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n" Mar 14 07:16:06 crc kubenswrapper[4893]: I0314 07:16:06.577196 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="555e3e795c1b2144fafa5a52a0f04d8158edf781d28c0498dd358d3734082ad2" Mar 14 07:16:07 crc kubenswrapper[4893]: I0314 07:16:07.384839 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3d9444-32d8-44e1-93cd-5ab047703eec" path="/var/lib/kubelet/pods/7c3d9444-32d8-44e1-93cd-5ab047703eec/volumes" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.368413 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78"] Mar 14 07:16:11 crc kubenswrapper[4893]: E0314 07:16:11.369381 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="pull" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369404 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="pull" Mar 14 07:16:11 crc kubenswrapper[4893]: E0314 07:16:11.369425 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8ca722-d281-4ac8-9da6-74dbd04787c4" containerName="oc" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369437 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8ca722-d281-4ac8-9da6-74dbd04787c4" containerName="oc" Mar 14 07:16:11 crc kubenswrapper[4893]: E0314 07:16:11.369457 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="util" Mar 
14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369471 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="util" Mar 14 07:16:11 crc kubenswrapper[4893]: E0314 07:16:11.369498 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="extract" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369511 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="extract" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369728 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8ca722-d281-4ac8-9da6-74dbd04787c4" containerName="oc" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.369762 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3756e8-b82d-40c5-90c6-0cc4114980a6" containerName="extract" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.370385 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.375958 4893 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-pfml9" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.376777 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.380983 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.458577 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78"] Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.474832 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e15bd35-d265-4150-8fc4-88c26c9fde23-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.475010 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbp6q\" (UniqueName: \"kubernetes.io/projected/8e15bd35-d265-4150-8fc4-88c26c9fde23-kube-api-access-vbp6q\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.576418 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vbp6q\" (UniqueName: \"kubernetes.io/projected/8e15bd35-d265-4150-8fc4-88c26c9fde23-kube-api-access-vbp6q\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.576659 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e15bd35-d265-4150-8fc4-88c26c9fde23-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.577405 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e15bd35-d265-4150-8fc4-88c26c9fde23-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.636713 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbp6q\" (UniqueName: \"kubernetes.io/projected/8e15bd35-d265-4150-8fc4-88c26c9fde23-kube-api-access-vbp6q\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g7k78\" (UID: \"8e15bd35-d265-4150-8fc4-88c26c9fde23\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:11 crc kubenswrapper[4893]: I0314 07:16:11.691690 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" Mar 14 07:16:12 crc kubenswrapper[4893]: I0314 07:16:12.148064 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78"] Mar 14 07:16:12 crc kubenswrapper[4893]: I0314 07:16:12.616253 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" event={"ID":"8e15bd35-d265-4150-8fc4-88c26c9fde23","Type":"ContainerStarted","Data":"c2a978959d0161b79c4e79bd9f997dc785966461d78889a36caceb3ffcb96b7d"} Mar 14 07:16:15 crc kubenswrapper[4893]: I0314 07:16:15.635828 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" event={"ID":"8e15bd35-d265-4150-8fc4-88c26c9fde23","Type":"ContainerStarted","Data":"21ed4640009697d0d00e71e3c1e93971e7fff6d71ce5fcc58699a5355b62fa68"} Mar 14 07:16:15 crc kubenswrapper[4893]: I0314 07:16:15.666136 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g7k78" podStartSLOduration=1.883851747 podStartE2EDuration="4.66612221s" podCreationTimestamp="2026-03-14 07:16:11 +0000 UTC" firstStartedPulling="2026-03-14 07:16:12.15474955 +0000 UTC m=+1051.416926342" lastFinishedPulling="2026-03-14 07:16:14.937020003 +0000 UTC m=+1054.199196805" observedRunningTime="2026-03-14 07:16:15.66367004 +0000 UTC m=+1054.925846832" watchObservedRunningTime="2026-03-14 07:16:15.66612221 +0000 UTC m=+1054.928298992" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.678508 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-khrgn"] Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.679666 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.682394 4893 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6nzgg" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.682573 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.682631 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.693366 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzdh\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-kube-api-access-qqzdh\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: \"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.693436 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: \"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.729906 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-khrgn"] Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.794767 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: 
\"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.794886 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzdh\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-kube-api-access-qqzdh\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: \"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.813116 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: \"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:18 crc kubenswrapper[4893]: I0314 07:16:18.813834 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzdh\" (UniqueName: \"kubernetes.io/projected/6ccf1897-752e-420b-9ba9-ed767e6eac38-kube-api-access-qqzdh\") pod \"cert-manager-webhook-6888856db4-khrgn\" (UID: \"6ccf1897-752e-420b-9ba9-ed767e6eac38\") " pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:19 crc kubenswrapper[4893]: I0314 07:16:19.000135 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:19 crc kubenswrapper[4893]: I0314 07:16:19.217592 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-khrgn"] Mar 14 07:16:19 crc kubenswrapper[4893]: I0314 07:16:19.654958 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" event={"ID":"6ccf1897-752e-420b-9ba9-ed767e6eac38","Type":"ContainerStarted","Data":"c959e7b5557f50aac4eba6375cecbdbe60e1c2ef9d59234cf300b6a8dd25c372"} Mar 14 07:16:23 crc kubenswrapper[4893]: I0314 07:16:23.682598 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" event={"ID":"6ccf1897-752e-420b-9ba9-ed767e6eac38","Type":"ContainerStarted","Data":"7c4007cb310fed6afd51bd1983ad48676947180f8262b7af2f445257b129ceb8"} Mar 14 07:16:23 crc kubenswrapper[4893]: I0314 07:16:23.683155 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:23 crc kubenswrapper[4893]: I0314 07:16:23.704815 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" podStartSLOduration=1.848147134 podStartE2EDuration="5.704784928s" podCreationTimestamp="2026-03-14 07:16:18 +0000 UTC" firstStartedPulling="2026-03-14 07:16:19.22500921 +0000 UTC m=+1058.487186002" lastFinishedPulling="2026-03-14 07:16:23.081647004 +0000 UTC m=+1062.343823796" observedRunningTime="2026-03-14 07:16:23.700072433 +0000 UTC m=+1062.962249255" watchObservedRunningTime="2026-03-14 07:16:23.704784928 +0000 UTC m=+1062.966961760" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.030428 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fdtkk"] Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.031155 4893 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.033343 4893 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-k262v" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.045932 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fdtkk"] Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.079364 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vqp\" (UniqueName: \"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-kube-api-access-26vqp\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.079439 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.180344 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vqp\" (UniqueName: \"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-kube-api-access-26vqp\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.180424 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.201178 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vqp\" (UniqueName: \"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-kube-api-access-26vqp\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.206093 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8db0bf07-4adf-46db-8410-0e7b02326c81-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fdtkk\" (UID: \"8db0bf07-4adf-46db-8410-0e7b02326c81\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.346166 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" Mar 14 07:16:25 crc kubenswrapper[4893]: I0314 07:16:25.766703 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fdtkk"] Mar 14 07:16:25 crc kubenswrapper[4893]: W0314 07:16:25.776895 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db0bf07_4adf_46db_8410_0e7b02326c81.slice/crio-e880d03190725a850463730504bc4e60e4c1f319d86e46d914f61ff7fb05dc30 WatchSource:0}: Error finding container e880d03190725a850463730504bc4e60e4c1f319d86e46d914f61ff7fb05dc30: Status 404 returned error can't find the container with id e880d03190725a850463730504bc4e60e4c1f319d86e46d914f61ff7fb05dc30 Mar 14 07:16:26 crc kubenswrapper[4893]: I0314 07:16:26.705016 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" event={"ID":"8db0bf07-4adf-46db-8410-0e7b02326c81","Type":"ContainerStarted","Data":"c44b6d5a667b450017ec6ce6c412c951e3d7cae9df659dfa76a7fdf3b03152b6"} Mar 14 07:16:26 crc kubenswrapper[4893]: I0314 07:16:26.705365 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" event={"ID":"8db0bf07-4adf-46db-8410-0e7b02326c81","Type":"ContainerStarted","Data":"e880d03190725a850463730504bc4e60e4c1f319d86e46d914f61ff7fb05dc30"} Mar 14 07:16:26 crc kubenswrapper[4893]: I0314 07:16:26.731051 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-fdtkk" podStartSLOduration=1.730992735 podStartE2EDuration="1.730992735s" podCreationTimestamp="2026-03-14 07:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:16:26.725817248 +0000 UTC m=+1065.987994080" watchObservedRunningTime="2026-03-14 
07:16:26.730992735 +0000 UTC m=+1065.993169567" Mar 14 07:16:29 crc kubenswrapper[4893]: I0314 07:16:29.004699 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-khrgn" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.691368 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-7vxnn"] Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.693883 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.697844 4893 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lssl8" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.706768 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7vxnn"] Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.870044 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gd2\" (UniqueName: \"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-kube-api-access-w6gd2\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.870168 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-bound-sa-token\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.971789 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gd2\" (UniqueName: 
\"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-kube-api-access-w6gd2\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:37 crc kubenswrapper[4893]: I0314 07:16:37.972249 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-bound-sa-token\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.004347 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-bound-sa-token\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.012268 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gd2\" (UniqueName: \"kubernetes.io/projected/4b666b54-a5a5-4494-b03c-0d08c0bf9a79-kube-api-access-w6gd2\") pod \"cert-manager-545d4d4674-7vxnn\" (UID: \"4b666b54-a5a5-4494-b03c-0d08c0bf9a79\") " pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.026064 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7vxnn" Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.516051 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7vxnn"] Mar 14 07:16:38 crc kubenswrapper[4893]: W0314 07:16:38.524738 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b666b54_a5a5_4494_b03c_0d08c0bf9a79.slice/crio-03854aeca869b861c77368d8c49e0580fb29c4974d8490d0cc18a0ec313b6a51 WatchSource:0}: Error finding container 03854aeca869b861c77368d8c49e0580fb29c4974d8490d0cc18a0ec313b6a51: Status 404 returned error can't find the container with id 03854aeca869b861c77368d8c49e0580fb29c4974d8490d0cc18a0ec313b6a51 Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.794276 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7vxnn" event={"ID":"4b666b54-a5a5-4494-b03c-0d08c0bf9a79","Type":"ContainerStarted","Data":"043495ae97f7a6173a0d2f5ea01e78a1b64cf28f53896196bc1a2789ff792693"} Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.795473 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7vxnn" event={"ID":"4b666b54-a5a5-4494-b03c-0d08c0bf9a79","Type":"ContainerStarted","Data":"03854aeca869b861c77368d8c49e0580fb29c4974d8490d0cc18a0ec313b6a51"} Mar 14 07:16:38 crc kubenswrapper[4893]: I0314 07:16:38.821357 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-7vxnn" podStartSLOduration=1.821340185 podStartE2EDuration="1.821340185s" podCreationTimestamp="2026-03-14 07:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:16:38.818121916 +0000 UTC m=+1078.080298748" watchObservedRunningTime="2026-03-14 07:16:38.821340185 +0000 UTC m=+1078.083516977" Mar 
14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.120630 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.122384 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.124758 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-w8qrq" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.126290 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.126716 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.138606 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.230481 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w75mz\" (UniqueName: \"kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz\") pod \"openstack-operator-index-hjq59\" (UID: \"1233c63c-e1d9-42d7-9086-f73ba0933e36\") " pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.332067 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w75mz\" (UniqueName: \"kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz\") pod \"openstack-operator-index-hjq59\" (UID: \"1233c63c-e1d9-42d7-9086-f73ba0933e36\") " pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 
07:16:42.355191 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w75mz\" (UniqueName: \"kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz\") pod \"openstack-operator-index-hjq59\" (UID: \"1233c63c-e1d9-42d7-9086-f73ba0933e36\") " pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.460836 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.752128 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:42 crc kubenswrapper[4893]: W0314 07:16:42.782241 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1233c63c_e1d9_42d7_9086_f73ba0933e36.slice/crio-1dc12f34f59eae7d682ce9ec4e4a82fd8212285081112e4f9c6330cdae459010 WatchSource:0}: Error finding container 1dc12f34f59eae7d682ce9ec4e4a82fd8212285081112e4f9c6330cdae459010: Status 404 returned error can't find the container with id 1dc12f34f59eae7d682ce9ec4e4a82fd8212285081112e4f9c6330cdae459010 Mar 14 07:16:42 crc kubenswrapper[4893]: I0314 07:16:42.826670 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hjq59" event={"ID":"1233c63c-e1d9-42d7-9086-f73ba0933e36","Type":"ContainerStarted","Data":"1dc12f34f59eae7d682ce9ec4e4a82fd8212285081112e4f9c6330cdae459010"} Mar 14 07:16:43 crc kubenswrapper[4893]: I0314 07:16:43.835761 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hjq59" event={"ID":"1233c63c-e1d9-42d7-9086-f73ba0933e36","Type":"ContainerStarted","Data":"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1"} Mar 14 07:16:43 crc kubenswrapper[4893]: I0314 07:16:43.855181 4893 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hjq59" podStartSLOduration=1.014689972 podStartE2EDuration="1.855162205s" podCreationTimestamp="2026-03-14 07:16:42 +0000 UTC" firstStartedPulling="2026-03-14 07:16:42.786165196 +0000 UTC m=+1082.048341998" lastFinishedPulling="2026-03-14 07:16:43.626637439 +0000 UTC m=+1082.888814231" observedRunningTime="2026-03-14 07:16:43.851126935 +0000 UTC m=+1083.113303807" watchObservedRunningTime="2026-03-14 07:16:43.855162205 +0000 UTC m=+1083.117339007" Mar 14 07:16:44 crc kubenswrapper[4893]: I0314 07:16:44.364308 4893 scope.go:117] "RemoveContainer" containerID="d9a956f8c5ca6d1103ec66377fa782a47cd493892c94e61a329217220fa197fb" Mar 14 07:16:45 crc kubenswrapper[4893]: I0314 07:16:45.469929 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:45 crc kubenswrapper[4893]: I0314 07:16:45.849291 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hjq59" podUID="1233c63c-e1d9-42d7-9086-f73ba0933e36" containerName="registry-server" containerID="cri-o://c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1" gracePeriod=2 Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.084910 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cvls2"] Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.085823 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.102110 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cvls2"] Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.187129 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hdf\" (UniqueName: \"kubernetes.io/projected/1ec921c3-291e-460e-abe1-d981ae06b426-kube-api-access-c4hdf\") pod \"openstack-operator-index-cvls2\" (UID: \"1ec921c3-291e-460e-abe1-d981ae06b426\") " pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.288185 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hdf\" (UniqueName: \"kubernetes.io/projected/1ec921c3-291e-460e-abe1-d981ae06b426-kube-api-access-c4hdf\") pod \"openstack-operator-index-cvls2\" (UID: \"1ec921c3-291e-460e-abe1-d981ae06b426\") " pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.312187 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hdf\" (UniqueName: \"kubernetes.io/projected/1ec921c3-291e-460e-abe1-d981ae06b426-kube-api-access-c4hdf\") pod \"openstack-operator-index-cvls2\" (UID: \"1ec921c3-291e-460e-abe1-d981ae06b426\") " pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.372103 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.410985 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.490094 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w75mz\" (UniqueName: \"kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz\") pod \"1233c63c-e1d9-42d7-9086-f73ba0933e36\" (UID: \"1233c63c-e1d9-42d7-9086-f73ba0933e36\") " Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.496695 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz" (OuterVolumeSpecName: "kube-api-access-w75mz") pod "1233c63c-e1d9-42d7-9086-f73ba0933e36" (UID: "1233c63c-e1d9-42d7-9086-f73ba0933e36"). InnerVolumeSpecName "kube-api-access-w75mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.591845 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w75mz\" (UniqueName: \"kubernetes.io/projected/1233c63c-e1d9-42d7-9086-f73ba0933e36-kube-api-access-w75mz\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.602256 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cvls2"] Mar 14 07:16:46 crc kubenswrapper[4893]: W0314 07:16:46.607579 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec921c3_291e_460e_abe1_d981ae06b426.slice/crio-6b8fd73bdcf127ab7a87e478ef408a45f521eceb3a7a6ba2087061764f1c1ba0 WatchSource:0}: Error finding container 6b8fd73bdcf127ab7a87e478ef408a45f521eceb3a7a6ba2087061764f1c1ba0: Status 404 returned error can't find the container with id 6b8fd73bdcf127ab7a87e478ef408a45f521eceb3a7a6ba2087061764f1c1ba0 Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.859369 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cvls2" event={"ID":"1ec921c3-291e-460e-abe1-d981ae06b426","Type":"ContainerStarted","Data":"6b8fd73bdcf127ab7a87e478ef408a45f521eceb3a7a6ba2087061764f1c1ba0"} Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.861673 4893 generic.go:334] "Generic (PLEG): container finished" podID="1233c63c-e1d9-42d7-9086-f73ba0933e36" containerID="c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1" exitCode=0 Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.861716 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hjq59" event={"ID":"1233c63c-e1d9-42d7-9086-f73ba0933e36","Type":"ContainerDied","Data":"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1"} Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.861745 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hjq59" event={"ID":"1233c63c-e1d9-42d7-9086-f73ba0933e36","Type":"ContainerDied","Data":"1dc12f34f59eae7d682ce9ec4e4a82fd8212285081112e4f9c6330cdae459010"} Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.861772 4893 scope.go:117] "RemoveContainer" containerID="c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.861864 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hjq59" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.884666 4893 scope.go:117] "RemoveContainer" containerID="c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1" Mar 14 07:16:46 crc kubenswrapper[4893]: E0314 07:16:46.885186 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1\": container with ID starting with c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1 not found: ID does not exist" containerID="c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.885212 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1"} err="failed to get container status \"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1\": rpc error: code = NotFound desc = could not find container \"c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1\": container with ID starting with c67e800b373de6f66ff6e690167442067599fbf55f186ba34c01b7d02b48b2b1 not found: ID does not exist" Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.898358 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:46 crc kubenswrapper[4893]: I0314 07:16:46.903065 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hjq59"] Mar 14 07:16:47 crc kubenswrapper[4893]: I0314 07:16:47.387923 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1233c63c-e1d9-42d7-9086-f73ba0933e36" path="/var/lib/kubelet/pods/1233c63c-e1d9-42d7-9086-f73ba0933e36/volumes" Mar 14 07:16:47 crc kubenswrapper[4893]: I0314 07:16:47.874438 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cvls2" event={"ID":"1ec921c3-291e-460e-abe1-d981ae06b426","Type":"ContainerStarted","Data":"2535a928df63b7a8fcf036ab20e8482a420791fb6f120f5036f9b38cfa6cef74"} Mar 14 07:16:47 crc kubenswrapper[4893]: I0314 07:16:47.907132 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cvls2" podStartSLOduration=1.50498799 podStartE2EDuration="1.907105764s" podCreationTimestamp="2026-03-14 07:16:46 +0000 UTC" firstStartedPulling="2026-03-14 07:16:46.610803811 +0000 UTC m=+1085.872980603" lastFinishedPulling="2026-03-14 07:16:47.012921555 +0000 UTC m=+1086.275098377" observedRunningTime="2026-03-14 07:16:47.893801415 +0000 UTC m=+1087.155978267" watchObservedRunningTime="2026-03-14 07:16:47.907105764 +0000 UTC m=+1087.169282596" Mar 14 07:16:56 crc kubenswrapper[4893]: I0314 07:16:56.412511 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:56 crc kubenswrapper[4893]: I0314 07:16:56.413161 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:56 crc kubenswrapper[4893]: I0314 07:16:56.454303 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:16:56 crc kubenswrapper[4893]: I0314 07:16:56.978908 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cvls2" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.114215 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b"] Mar 14 07:17:03 crc kubenswrapper[4893]: E0314 07:17:03.115322 4893 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1233c63c-e1d9-42d7-9086-f73ba0933e36" containerName="registry-server" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.115352 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1233c63c-e1d9-42d7-9086-f73ba0933e36" containerName="registry-server" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.115583 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1233c63c-e1d9-42d7-9086-f73ba0933e36" containerName="registry-server" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.117031 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.120067 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-n9w6z" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.126259 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b"] Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.243168 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.243260 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fpx8\" (UniqueName: \"kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " 
pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.243652 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.345120 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.345270 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fpx8\" (UniqueName: \"kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.345372 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 
07:17:03.345654 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.346058 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.365166 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fpx8\" (UniqueName: \"kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.443388 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:03 crc kubenswrapper[4893]: I0314 07:17:03.722573 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b"] Mar 14 07:17:04 crc kubenswrapper[4893]: I0314 07:17:04.012166 4893 generic.go:334] "Generic (PLEG): container finished" podID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerID="027573d6563f6cda555c641fb9197887a19fd6c8ff6c4f70b79adb761c8ad702" exitCode=0 Mar 14 07:17:04 crc kubenswrapper[4893]: I0314 07:17:04.012416 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" event={"ID":"54ba629d-3d14-4c08-92ca-430abfe4177c","Type":"ContainerDied","Data":"027573d6563f6cda555c641fb9197887a19fd6c8ff6c4f70b79adb761c8ad702"} Mar 14 07:17:04 crc kubenswrapper[4893]: I0314 07:17:04.012444 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" event={"ID":"54ba629d-3d14-4c08-92ca-430abfe4177c","Type":"ContainerStarted","Data":"ce36eca0dbdbbf488389276613a23cc774bd686149b0f01a0c58a79a7304c8d5"} Mar 14 07:17:05 crc kubenswrapper[4893]: I0314 07:17:05.019645 4893 generic.go:334] "Generic (PLEG): container finished" podID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerID="185d45385341adb018c20c2a64fe9b1e574cec629be3e60d420f1aeecb4234f6" exitCode=0 Mar 14 07:17:05 crc kubenswrapper[4893]: I0314 07:17:05.019704 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" event={"ID":"54ba629d-3d14-4c08-92ca-430abfe4177c","Type":"ContainerDied","Data":"185d45385341adb018c20c2a64fe9b1e574cec629be3e60d420f1aeecb4234f6"} Mar 14 07:17:06 crc kubenswrapper[4893]: I0314 07:17:06.031095 4893 generic.go:334] 
"Generic (PLEG): container finished" podID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerID="7316c9c689e3df7bacbde9a9d2b432c751d1fb14705fb2e13fab54cf70ecc666" exitCode=0 Mar 14 07:17:06 crc kubenswrapper[4893]: I0314 07:17:06.031174 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" event={"ID":"54ba629d-3d14-4c08-92ca-430abfe4177c","Type":"ContainerDied","Data":"7316c9c689e3df7bacbde9a9d2b432c751d1fb14705fb2e13fab54cf70ecc666"} Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.302632 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.406596 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle\") pod \"54ba629d-3d14-4c08-92ca-430abfe4177c\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.406669 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fpx8\" (UniqueName: \"kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8\") pod \"54ba629d-3d14-4c08-92ca-430abfe4177c\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.406701 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util\") pod \"54ba629d-3d14-4c08-92ca-430abfe4177c\" (UID: \"54ba629d-3d14-4c08-92ca-430abfe4177c\") " Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.407382 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle" (OuterVolumeSpecName: "bundle") pod "54ba629d-3d14-4c08-92ca-430abfe4177c" (UID: "54ba629d-3d14-4c08-92ca-430abfe4177c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.412142 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8" (OuterVolumeSpecName: "kube-api-access-9fpx8") pod "54ba629d-3d14-4c08-92ca-430abfe4177c" (UID: "54ba629d-3d14-4c08-92ca-430abfe4177c"). InnerVolumeSpecName "kube-api-access-9fpx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.427114 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util" (OuterVolumeSpecName: "util") pod "54ba629d-3d14-4c08-92ca-430abfe4177c" (UID: "54ba629d-3d14-4c08-92ca-430abfe4177c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.508445 4893 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.508475 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fpx8\" (UniqueName: \"kubernetes.io/projected/54ba629d-3d14-4c08-92ca-430abfe4177c-kube-api-access-9fpx8\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:07 crc kubenswrapper[4893]: I0314 07:17:07.508484 4893 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54ba629d-3d14-4c08-92ca-430abfe4177c-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:08 crc kubenswrapper[4893]: I0314 07:17:08.050966 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" event={"ID":"54ba629d-3d14-4c08-92ca-430abfe4177c","Type":"ContainerDied","Data":"ce36eca0dbdbbf488389276613a23cc774bd686149b0f01a0c58a79a7304c8d5"} Mar 14 07:17:08 crc kubenswrapper[4893]: I0314 07:17:08.051021 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce36eca0dbdbbf488389276613a23cc774bd686149b0f01a0c58a79a7304c8d5" Mar 14 07:17:08 crc kubenswrapper[4893]: I0314 07:17:08.051085 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.307775 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l"] Mar 14 07:17:16 crc kubenswrapper[4893]: E0314 07:17:16.308607 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="util" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.308622 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="util" Mar 14 07:17:16 crc kubenswrapper[4893]: E0314 07:17:16.308643 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="pull" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.308651 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="pull" Mar 14 07:17:16 crc kubenswrapper[4893]: E0314 07:17:16.308672 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="extract" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.308683 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="extract" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.308831 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ba629d-3d14-4c08-92ca-430abfe4177c" containerName="extract" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.309286 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.312094 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gp9cz" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.376005 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l"] Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.434969 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qcc\" (UniqueName: \"kubernetes.io/projected/271e053c-65a4-4422-9a1c-132c83e80ce5-kube-api-access-g8qcc\") pod \"openstack-operator-controller-init-6dc56d8cd6-q649l\" (UID: \"271e053c-65a4-4422-9a1c-132c83e80ce5\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.536048 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qcc\" (UniqueName: \"kubernetes.io/projected/271e053c-65a4-4422-9a1c-132c83e80ce5-kube-api-access-g8qcc\") pod \"openstack-operator-controller-init-6dc56d8cd6-q649l\" (UID: \"271e053c-65a4-4422-9a1c-132c83e80ce5\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.560160 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qcc\" (UniqueName: \"kubernetes.io/projected/271e053c-65a4-4422-9a1c-132c83e80ce5-kube-api-access-g8qcc\") pod \"openstack-operator-controller-init-6dc56d8cd6-q649l\" (UID: \"271e053c-65a4-4422-9a1c-132c83e80ce5\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.624510 4893 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:16 crc kubenswrapper[4893]: I0314 07:17:16.915476 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l"] Mar 14 07:17:17 crc kubenswrapper[4893]: I0314 07:17:17.112354 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" event={"ID":"271e053c-65a4-4422-9a1c-132c83e80ce5","Type":"ContainerStarted","Data":"a4e9b970b9720d591368eadfb04f6207ccbbdef1a0d4d15ad999822db003e33f"} Mar 14 07:17:21 crc kubenswrapper[4893]: I0314 07:17:21.138142 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" event={"ID":"271e053c-65a4-4422-9a1c-132c83e80ce5","Type":"ContainerStarted","Data":"92cbe01f26a5e864475a21adb6c5446124ebc0385d690634eb9048e1fcb5fe53"} Mar 14 07:17:21 crc kubenswrapper[4893]: I0314 07:17:21.138841 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:21 crc kubenswrapper[4893]: I0314 07:17:21.179844 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" podStartSLOduration=1.336921363 podStartE2EDuration="5.179820376s" podCreationTimestamp="2026-03-14 07:17:16 +0000 UTC" firstStartedPulling="2026-03-14 07:17:16.932192644 +0000 UTC m=+1116.194369436" lastFinishedPulling="2026-03-14 07:17:20.775091667 +0000 UTC m=+1120.037268449" observedRunningTime="2026-03-14 07:17:21.177849108 +0000 UTC m=+1120.440025900" watchObservedRunningTime="2026-03-14 07:17:21.179820376 +0000 UTC m=+1120.441997198" Mar 14 07:17:26 crc kubenswrapper[4893]: I0314 07:17:26.628927 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-q649l" Mar 14 07:17:59 crc kubenswrapper[4893]: I0314 07:17:59.730719 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:17:59 crc kubenswrapper[4893]: I0314 07:17:59.731297 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.123649 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557878-cvzrx"] Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.124471 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.126481 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.127216 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.127629 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.130312 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-cvzrx"] Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.153093 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6bbj\" (UniqueName: \"kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj\") pod \"auto-csr-approver-29557878-cvzrx\" (UID: \"85464791-1f62-4d33-bd20-813896eef4b8\") " pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.254288 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6bbj\" (UniqueName: \"kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj\") pod \"auto-csr-approver-29557878-cvzrx\" (UID: \"85464791-1f62-4d33-bd20-813896eef4b8\") " pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.281575 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6bbj\" (UniqueName: \"kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj\") pod \"auto-csr-approver-29557878-cvzrx\" (UID: \"85464791-1f62-4d33-bd20-813896eef4b8\") " 
pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.447754 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:00 crc kubenswrapper[4893]: I0314 07:18:00.845790 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-cvzrx"] Mar 14 07:18:01 crc kubenswrapper[4893]: I0314 07:18:01.424854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" event={"ID":"85464791-1f62-4d33-bd20-813896eef4b8","Type":"ContainerStarted","Data":"22a9be2bbe1e823192c98fe72520e243768427c62ca4bd00647b109e3734c9f1"} Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.235871 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-lg76w"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.237484 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.239664 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.239813 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8dlb5" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.240424 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.252973 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.253990 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-b52x9" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.262009 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-lg76w"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.270230 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.271150 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.274892 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rxsg5" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.277784 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.293389 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfp2m\" (UniqueName: \"kubernetes.io/projected/10f5729a-cf9e-4851-9173-2c6c0d6dbcf0-kube-api-access-mfp2m\") pod \"designate-operator-controller-manager-66d56f6ff4-cjqhn\" (UID: \"10f5729a-cf9e-4851-9173-2c6c0d6dbcf0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.293462 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pv8\" (UniqueName: \"kubernetes.io/projected/bb479ff8-15fb-458b-87c7-ec4b3d15721d-kube-api-access-j4pv8\") pod \"barbican-operator-controller-manager-d47688694-lg76w\" (UID: \"bb479ff8-15fb-458b-87c7-ec4b3d15721d\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.293507 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wjf\" (UniqueName: \"kubernetes.io/projected/49eeeddb-7017-4642-aabb-ea932fa16ac7-kube-api-access-p7wjf\") pod \"cinder-operator-controller-manager-984cd4dcf-xr2zr\" (UID: \"49eeeddb-7017-4642-aabb-ea932fa16ac7\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:03 crc kubenswrapper[4893]: 
I0314 07:18:03.316739 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-65klp"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.317902 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.320129 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gdvwj" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.322584 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-65klp"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.355566 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.356532 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.359154 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-px4zc" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.359328 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.360110 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.365927 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2kx6m" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.370967 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.387100 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.387146 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.387982 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.393909 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.394338 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n46rh" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395167 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pv8\" (UniqueName: \"kubernetes.io/projected/bb479ff8-15fb-458b-87c7-ec4b3d15721d-kube-api-access-j4pv8\") pod \"barbican-operator-controller-manager-d47688694-lg76w\" (UID: \"bb479ff8-15fb-458b-87c7-ec4b3d15721d\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395228 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx8bg\" (UniqueName: \"kubernetes.io/projected/9e7e613b-2bb4-40a5-a8c2-5649763d4a61-kube-api-access-tx8bg\") pod \"glance-operator-controller-manager-5964f64c48-65klp\" (UID: \"9e7e613b-2bb4-40a5-a8c2-5649763d4a61\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395267 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395337 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-p7wjf\" (UniqueName: \"kubernetes.io/projected/49eeeddb-7017-4642-aabb-ea932fa16ac7-kube-api-access-p7wjf\") pod \"cinder-operator-controller-manager-984cd4dcf-xr2zr\" (UID: \"49eeeddb-7017-4642-aabb-ea932fa16ac7\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395458 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64jlf\" (UniqueName: \"kubernetes.io/projected/26bb4dec-8e1a-4fdc-a7ca-808ae7026afe-kube-api-access-64jlf\") pod \"horizon-operator-controller-manager-6d9d6b584d-djn7l\" (UID: \"26bb4dec-8e1a-4fdc-a7ca-808ae7026afe\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395582 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7429\" (UniqueName: \"kubernetes.io/projected/9966deb5-a260-43a8-bec4-772b6308266d-kube-api-access-w7429\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395637 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kljz\" (UniqueName: \"kubernetes.io/projected/a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88-kube-api-access-9kljz\") pod \"heat-operator-controller-manager-77b6666d85-qfg79\" (UID: \"a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.395707 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfp2m\" (UniqueName: 
\"kubernetes.io/projected/10f5729a-cf9e-4851-9173-2c6c0d6dbcf0-kube-api-access-mfp2m\") pod \"designate-operator-controller-manager-66d56f6ff4-cjqhn\" (UID: \"10f5729a-cf9e-4851-9173-2c6c0d6dbcf0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.401483 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.407231 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.408275 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.424301 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfp2m\" (UniqueName: \"kubernetes.io/projected/10f5729a-cf9e-4851-9173-2c6c0d6dbcf0-kube-api-access-mfp2m\") pod \"designate-operator-controller-manager-66d56f6ff4-cjqhn\" (UID: \"10f5729a-cf9e-4851-9173-2c6c0d6dbcf0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.424891 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-djrdc" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.429152 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wjf\" (UniqueName: \"kubernetes.io/projected/49eeeddb-7017-4642-aabb-ea932fa16ac7-kube-api-access-p7wjf\") pod \"cinder-operator-controller-manager-984cd4dcf-xr2zr\" (UID: \"49eeeddb-7017-4642-aabb-ea932fa16ac7\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" 
Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.477455 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.484128 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.485002 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.486623 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rm4mj" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.488194 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pv8\" (UniqueName: \"kubernetes.io/projected/bb479ff8-15fb-458b-87c7-ec4b3d15721d-kube-api-access-j4pv8\") pod \"barbican-operator-controller-manager-d47688694-lg76w\" (UID: \"bb479ff8-15fb-458b-87c7-ec4b3d15721d\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496762 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx8bg\" (UniqueName: \"kubernetes.io/projected/9e7e613b-2bb4-40a5-a8c2-5649763d4a61-kube-api-access-tx8bg\") pod \"glance-operator-controller-manager-5964f64c48-65klp\" (UID: \"9e7e613b-2bb4-40a5-a8c2-5649763d4a61\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496805 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjcwt\" (UniqueName: 
\"kubernetes.io/projected/1e068b17-cab4-421d-b501-cd825de6b67c-kube-api-access-tjcwt\") pod \"ironic-operator-controller-manager-5bc894d9b-gm6pg\" (UID: \"1e068b17-cab4-421d-b501-cd825de6b67c\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496832 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496869 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64jlf\" (UniqueName: \"kubernetes.io/projected/26bb4dec-8e1a-4fdc-a7ca-808ae7026afe-kube-api-access-64jlf\") pod \"horizon-operator-controller-manager-6d9d6b584d-djn7l\" (UID: \"26bb4dec-8e1a-4fdc-a7ca-808ae7026afe\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496901 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7429\" (UniqueName: \"kubernetes.io/projected/9966deb5-a260-43a8-bec4-772b6308266d-kube-api-access-w7429\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496933 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kljz\" (UniqueName: \"kubernetes.io/projected/a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88-kube-api-access-9kljz\") pod \"heat-operator-controller-manager-77b6666d85-qfg79\" (UID: \"a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88\") " 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.496958 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnkd\" (UniqueName: \"kubernetes.io/projected/01b24940-e97d-472d-902c-87bfe6b67147-kube-api-access-2dnkd\") pod \"keystone-operator-controller-manager-684f77d66d-xfpdm\" (UID: \"01b24940-e97d-472d-902c-87bfe6b67147\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:03 crc kubenswrapper[4893]: E0314 07:18:03.497017 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:03 crc kubenswrapper[4893]: E0314 07:18:03.497078 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. No retries permitted until 2026-03-14 07:18:03.99706127 +0000 UTC m=+1163.259238052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.498561 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.499454 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.505452 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vhfpq" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.514395 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx8bg\" (UniqueName: \"kubernetes.io/projected/9e7e613b-2bb4-40a5-a8c2-5649763d4a61-kube-api-access-tx8bg\") pod \"glance-operator-controller-manager-5964f64c48-65klp\" (UID: \"9e7e613b-2bb4-40a5-a8c2-5649763d4a61\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.518660 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7429\" (UniqueName: \"kubernetes.io/projected/9966deb5-a260-43a8-bec4-772b6308266d-kube-api-access-w7429\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.522429 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.531093 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64jlf\" (UniqueName: \"kubernetes.io/projected/26bb4dec-8e1a-4fdc-a7ca-808ae7026afe-kube-api-access-64jlf\") pod \"horizon-operator-controller-manager-6d9d6b584d-djn7l\" (UID: \"26bb4dec-8e1a-4fdc-a7ca-808ae7026afe\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.531634 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.534443 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kljz\" (UniqueName: \"kubernetes.io/projected/a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88-kube-api-access-9kljz\") pod \"heat-operator-controller-manager-77b6666d85-qfg79\" (UID: \"a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.539591 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.540595 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.543043 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tfpzv" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.558735 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.559637 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.560610 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.561670 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-x7jxw" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.563924 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.573704 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.579958 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.580819 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.588059 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qbm8l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.589838 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.590677 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.595623 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.596501 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598256 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92gcf\" (UniqueName: \"kubernetes.io/projected/c50bd703-5cd9-418a-b623-4809a8ff4213-kube-api-access-92gcf\") pod \"neutron-operator-controller-manager-776c5696bf-zkh7h\" (UID: \"c50bd703-5cd9-418a-b623-4809a8ff4213\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598322 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zgxp\" (UniqueName: \"kubernetes.io/projected/0997c764-6b32-41ea-adca-b04feb1fbe6f-kube-api-access-2zgxp\") pod \"nova-operator-controller-manager-7f84474648-xjwp7\" (UID: \"0997c764-6b32-41ea-adca-b04feb1fbe6f\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598371 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjcwt\" (UniqueName: \"kubernetes.io/projected/1e068b17-cab4-421d-b501-cd825de6b67c-kube-api-access-tjcwt\") pod 
\"ironic-operator-controller-manager-5bc894d9b-gm6pg\" (UID: \"1e068b17-cab4-421d-b501-cd825de6b67c\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598486 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnhv\" (UniqueName: \"kubernetes.io/projected/b8da0600-7ae7-4f7a-8b3e-c81523dc6034-kube-api-access-vxnhv\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-cxg74\" (UID: \"b8da0600-7ae7-4f7a-8b3e-c81523dc6034\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598512 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnkd\" (UniqueName: \"kubernetes.io/projected/01b24940-e97d-472d-902c-87bfe6b67147-kube-api-access-2dnkd\") pod \"keystone-operator-controller-manager-684f77d66d-xfpdm\" (UID: \"01b24940-e97d-472d-902c-87bfe6b67147\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598545 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6b7\" (UniqueName: \"kubernetes.io/projected/7a3a590a-bacd-4d27-91f1-1b6afd52ab3e-kube-api-access-9s6b7\") pod \"manila-operator-controller-manager-57b484b4df-tlm9k\" (UID: \"7a3a590a-bacd-4d27-91f1-1b6afd52ab3e\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.598762 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8j95b" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.620107 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.622293 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.624226 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnkd\" (UniqueName: \"kubernetes.io/projected/01b24940-e97d-472d-902c-87bfe6b67147-kube-api-access-2dnkd\") pod \"keystone-operator-controller-manager-684f77d66d-xfpdm\" (UID: \"01b24940-e97d-472d-902c-87bfe6b67147\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.626673 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjcwt\" (UniqueName: \"kubernetes.io/projected/1e068b17-cab4-421d-b501-cd825de6b67c-kube-api-access-tjcwt\") pod \"ironic-operator-controller-manager-5bc894d9b-gm6pg\" (UID: \"1e068b17-cab4-421d-b501-cd825de6b67c\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.635460 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.681828 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.692445 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.699298 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s6b7\" (UniqueName: \"kubernetes.io/projected/7a3a590a-bacd-4d27-91f1-1b6afd52ab3e-kube-api-access-9s6b7\") pod \"manila-operator-controller-manager-57b484b4df-tlm9k\" (UID: \"7a3a590a-bacd-4d27-91f1-1b6afd52ab3e\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.699337 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92gcf\" (UniqueName: \"kubernetes.io/projected/c50bd703-5cd9-418a-b623-4809a8ff4213-kube-api-access-92gcf\") pod \"neutron-operator-controller-manager-776c5696bf-zkh7h\" (UID: \"c50bd703-5cd9-418a-b623-4809a8ff4213\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.699367 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zgxp\" (UniqueName: \"kubernetes.io/projected/0997c764-6b32-41ea-adca-b04feb1fbe6f-kube-api-access-2zgxp\") pod \"nova-operator-controller-manager-7f84474648-xjwp7\" (UID: \"0997c764-6b32-41ea-adca-b04feb1fbe6f\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.699697 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6szj\" (UniqueName: \"kubernetes.io/projected/1a01facb-b7c0-476b-96b1-759e6e9f3c30-kube-api-access-m6szj\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gztc6\" (UID: \"1a01facb-b7c0-476b-96b1-759e6e9f3c30\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:03 crc kubenswrapper[4893]: 
I0314 07:18:03.699749 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnhv\" (UniqueName: \"kubernetes.io/projected/b8da0600-7ae7-4f7a-8b3e-c81523dc6034-kube-api-access-vxnhv\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-cxg74\" (UID: \"b8da0600-7ae7-4f7a-8b3e-c81523dc6034\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.723601 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnhv\" (UniqueName: \"kubernetes.io/projected/b8da0600-7ae7-4f7a-8b3e-c81523dc6034-kube-api-access-vxnhv\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-cxg74\" (UID: \"b8da0600-7ae7-4f7a-8b3e-c81523dc6034\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.724930 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zgxp\" (UniqueName: \"kubernetes.io/projected/0997c764-6b32-41ea-adca-b04feb1fbe6f-kube-api-access-2zgxp\") pod \"nova-operator-controller-manager-7f84474648-xjwp7\" (UID: \"0997c764-6b32-41ea-adca-b04feb1fbe6f\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.725345 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6b7\" (UniqueName: \"kubernetes.io/projected/7a3a590a-bacd-4d27-91f1-1b6afd52ab3e-kube-api-access-9s6b7\") pod \"manila-operator-controller-manager-57b484b4df-tlm9k\" (UID: \"7a3a590a-bacd-4d27-91f1-1b6afd52ab3e\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.727107 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92gcf\" (UniqueName: 
\"kubernetes.io/projected/c50bd703-5cd9-418a-b623-4809a8ff4213-kube-api-access-92gcf\") pod \"neutron-operator-controller-manager-776c5696bf-zkh7h\" (UID: \"c50bd703-5cd9-418a-b623-4809a8ff4213\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.801280 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6szj\" (UniqueName: \"kubernetes.io/projected/1a01facb-b7c0-476b-96b1-759e6e9f3c30-kube-api-access-m6szj\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gztc6\" (UID: \"1a01facb-b7c0-476b-96b1-759e6e9f3c30\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.811853 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.823706 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6szj\" (UniqueName: \"kubernetes.io/projected/1a01facb-b7c0-476b-96b1-759e6e9f3c30-kube-api-access-m6szj\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gztc6\" (UID: \"1a01facb-b7c0-476b-96b1-759e6e9f3c30\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.871310 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.871729 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.872474 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.881627 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.882583 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.892846 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.903117 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.903186 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29kd\" (UniqueName: \"kubernetes.io/projected/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-kube-api-access-t29kd\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.903255 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hmv\" (UniqueName: \"kubernetes.io/projected/2b0418ea-6737-4c52-b1bc-915fba5ef735-kube-api-access-l6hmv\") pod 
\"ovn-operator-controller-manager-bbc5b68f9-7967f\" (UID: \"2b0418ea-6737-4c52-b1bc-915fba5ef735\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.903418 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.904875 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gzbtj" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.905089 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.905196 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9jssg" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.917851 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.918633 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.922148 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.930763 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fd5kr" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.932818 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.940724 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.948217 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f"] Mar 14 07:18:03 crc kubenswrapper[4893]: I0314 07:18:03.976775 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.008047 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29kd\" (UniqueName: \"kubernetes.io/projected/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-kube-api-access-t29kd\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.008167 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.008191 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hmv\" (UniqueName: \"kubernetes.io/projected/2b0418ea-6737-4c52-b1bc-915fba5ef735-kube-api-access-l6hmv\") pod \"ovn-operator-controller-manager-bbc5b68f9-7967f\" (UID: \"2b0418ea-6737-4c52-b1bc-915fba5ef735\") 
" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.008251 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.008414 4893 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.008460 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert podName:ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:04.508444549 +0000 UTC m=+1163.770621341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" (UID: "ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.009879 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.009944 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. 
No retries permitted until 2026-03-14 07:18:05.009907615 +0000 UTC m=+1164.272084407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.028183 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.037499 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hmv\" (UniqueName: \"kubernetes.io/projected/2b0418ea-6737-4c52-b1bc-915fba5ef735-kube-api-access-l6hmv\") pod \"ovn-operator-controller-manager-bbc5b68f9-7967f\" (UID: \"2b0418ea-6737-4c52-b1bc-915fba5ef735\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.042296 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.048433 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t29kd\" (UniqueName: \"kubernetes.io/projected/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-kube-api-access-t29kd\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.063044 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.063145 4893 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.081093 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xdfb7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.089414 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.090399 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.100230 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-shlgc" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.102619 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.130397 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.134092 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dsqdn" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.136721 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvq6s\" (UniqueName: \"kubernetes.io/projected/373cda74-3e73-4a92-9d91-395826ab1864-kube-api-access-cvq6s\") pod \"swift-operator-controller-manager-7f9cc5dd44-wgkn7\" (UID: \"373cda74-3e73-4a92-9d91-395826ab1864\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.136990 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cs7n\" (UniqueName: \"kubernetes.io/projected/d92c0e9c-9151-4619-8ce6-9eb5cd77d093-kube-api-access-6cs7n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-49ss2\" (UID: \"d92c0e9c-9151-4619-8ce6-9eb5cd77d093\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.137186 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmc2h\" (UniqueName: \"kubernetes.io/projected/887bf8c6-7149-4e9b-b67c-bc70791532b2-kube-api-access-jmc2h\") pod \"placement-operator-controller-manager-574d45c66c-kwcbl\" (UID: \"887bf8c6-7149-4e9b-b67c-bc70791532b2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.149842 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 
07:18:04.151426 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.198694 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9c8x6" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.216834 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.265369 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvq6s\" (UniqueName: \"kubernetes.io/projected/373cda74-3e73-4a92-9d91-395826ab1864-kube-api-access-cvq6s\") pod \"swift-operator-controller-manager-7f9cc5dd44-wgkn7\" (UID: \"373cda74-3e73-4a92-9d91-395826ab1864\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.265757 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cs7n\" (UniqueName: \"kubernetes.io/projected/d92c0e9c-9151-4619-8ce6-9eb5cd77d093-kube-api-access-6cs7n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-49ss2\" (UID: \"d92c0e9c-9151-4619-8ce6-9eb5cd77d093\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.265805 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmc2h\" (UniqueName: \"kubernetes.io/projected/887bf8c6-7149-4e9b-b67c-bc70791532b2-kube-api-access-jmc2h\") pod \"placement-operator-controller-manager-574d45c66c-kwcbl\" (UID: \"887bf8c6-7149-4e9b-b67c-bc70791532b2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:04 crc kubenswrapper[4893]: 
I0314 07:18:04.265835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5nbt\" (UniqueName: \"kubernetes.io/projected/b440d436-7017-4503-823f-16e998dbf74d-kube-api-access-d5nbt\") pod \"test-operator-controller-manager-5c5cb9c4d7-lbg2b\" (UID: \"b440d436-7017-4503-823f-16e998dbf74d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.265857 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpb4w\" (UniqueName: \"kubernetes.io/projected/633253f4-c3d4-439d-9b79-dee8c6d41bdc-kube-api-access-kpb4w\") pod \"watcher-operator-controller-manager-6c4d75f7f9-9rs6t\" (UID: \"633253f4-c3d4-439d-9b79-dee8c6d41bdc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.274841 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.288436 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmc2h\" (UniqueName: \"kubernetes.io/projected/887bf8c6-7149-4e9b-b67c-bc70791532b2-kube-api-access-jmc2h\") pod \"placement-operator-controller-manager-574d45c66c-kwcbl\" (UID: \"887bf8c6-7149-4e9b-b67c-bc70791532b2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.289944 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvq6s\" (UniqueName: \"kubernetes.io/projected/373cda74-3e73-4a92-9d91-395826ab1864-kube-api-access-cvq6s\") pod \"swift-operator-controller-manager-7f9cc5dd44-wgkn7\" (UID: \"373cda74-3e73-4a92-9d91-395826ab1864\") " 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.291511 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cs7n\" (UniqueName: \"kubernetes.io/projected/d92c0e9c-9151-4619-8ce6-9eb5cd77d093-kube-api-access-6cs7n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-49ss2\" (UID: \"d92c0e9c-9151-4619-8ce6-9eb5cd77d093\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.320345 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.333038 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.368474 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5nbt\" (UniqueName: \"kubernetes.io/projected/b440d436-7017-4503-823f-16e998dbf74d-kube-api-access-d5nbt\") pod \"test-operator-controller-manager-5c5cb9c4d7-lbg2b\" (UID: \"b440d436-7017-4503-823f-16e998dbf74d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.368552 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpb4w\" (UniqueName: \"kubernetes.io/projected/633253f4-c3d4-439d-9b79-dee8c6d41bdc-kube-api-access-kpb4w\") pod \"watcher-operator-controller-manager-6c4d75f7f9-9rs6t\" (UID: \"633253f4-c3d4-439d-9b79-dee8c6d41bdc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.380053 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.384480 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpb4w\" (UniqueName: \"kubernetes.io/projected/633253f4-c3d4-439d-9b79-dee8c6d41bdc-kube-api-access-kpb4w\") pod \"watcher-operator-controller-manager-6c4d75f7f9-9rs6t\" (UID: \"633253f4-c3d4-439d-9b79-dee8c6d41bdc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.396370 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5nbt\" (UniqueName: \"kubernetes.io/projected/b440d436-7017-4503-823f-16e998dbf74d-kube-api-access-d5nbt\") pod \"test-operator-controller-manager-5c5cb9c4d7-lbg2b\" (UID: \"b440d436-7017-4503-823f-16e998dbf74d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.435089 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.436032 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.438933 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.439116 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pmxnx" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.439227 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.445496 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.451405 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.466609 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.478668 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" event={"ID":"bb479ff8-15fb-458b-87c7-ec4b3d15721d","Type":"ContainerStarted","Data":"3f408ee2452509390230007b02f0afeb09237c9ed0817aa9013bfe7c826038a5"} Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.483441 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" event={"ID":"10f5729a-cf9e-4851-9173-2c6c0d6dbcf0","Type":"ContainerStarted","Data":"c3bdb55eb877759c59149786a2cc181d45b52f16a54a7cc8a3bbc0d8ca8ef51a"} Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.487745 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" event={"ID":"49eeeddb-7017-4642-aabb-ea932fa16ac7","Type":"ContainerStarted","Data":"3d60b560b4cb0f8f1ab39fade1de5f2be33220305fedfdb8c697ec9934cc7ae5"} Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.504579 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.506018 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.508286 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.509920 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gkkmj" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.523911 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.539268 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-lg76w"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.548420 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.555456 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-65klp"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.561491 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.571451 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.578503 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgbv\" (UniqueName: \"kubernetes.io/projected/7bab7ac9-4699-41cc-b1ca-0f344b13ab15-kube-api-access-lmgbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dq859\" (UID: \"7bab7ac9-4699-41cc-b1ca-0f344b13ab15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.578615 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.578638 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.578658 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod 
\"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.578706 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xpb\" (UniqueName: \"kubernetes.io/projected/be24c0b7-17b5-4962-b12d-f438a21b953f-kube-api-access-k7xpb\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.578890 4893 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.578931 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert podName:ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:05.578917506 +0000 UTC m=+1164.841094298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" (UID: "ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.580067 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.592160 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.680210 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.680253 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.680310 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xpb\" (UniqueName: \"kubernetes.io/projected/be24c0b7-17b5-4962-b12d-f438a21b953f-kube-api-access-k7xpb\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.680372 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgbv\" (UniqueName: \"kubernetes.io/projected/7bab7ac9-4699-41cc-b1ca-0f344b13ab15-kube-api-access-lmgbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dq859\" (UID: \"7bab7ac9-4699-41cc-b1ca-0f344b13ab15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.680852 4893 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.680893 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:05.180880439 +0000 UTC m=+1164.443057231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "metrics-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.681024 4893 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: E0314 07:18:04.681049 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:05.181039744 +0000 UTC m=+1164.443216536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "webhook-server-cert" not found Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.702164 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgbv\" (UniqueName: \"kubernetes.io/projected/7bab7ac9-4699-41cc-b1ca-0f344b13ab15-kube-api-access-lmgbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dq859\" (UID: \"7bab7ac9-4699-41cc-b1ca-0f344b13ab15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.702975 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xpb\" (UniqueName: \"kubernetes.io/projected/be24c0b7-17b5-4962-b12d-f438a21b953f-kube-api-access-k7xpb\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.769111 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm"] Mar 14 07:18:04 crc kubenswrapper[4893]: I0314 07:18:04.966722 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.062639 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.085814 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.086480 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.086647 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.086722 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. No retries permitted until 2026-03-14 07:18:07.086692835 +0000 UTC m=+1166.348869687 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.112594 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.125124 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74"] Mar 14 07:18:05 crc kubenswrapper[4893]: W0314 07:18:05.142567 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a01facb_b7c0_476b_96b1_759e6e9f3c30.slice/crio-0305ba6d4c81a1392fa810f164a718fece56caa555fc43ac6467d3b6a4834953 WatchSource:0}: Error finding container 0305ba6d4c81a1392fa810f164a718fece56caa555fc43ac6467d3b6a4834953: Status 404 returned error can't find the container with id 0305ba6d4c81a1392fa810f164a718fece56caa555fc43ac6467d3b6a4834953 Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.144240 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.163837 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.170458 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.176950 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.188060 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.188297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.188333 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.188461 4893 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.188515 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:06.188496516 +0000 UTC m=+1165.450673318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "webhook-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.188648 4893 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.188683 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:06.18867173 +0000 UTC m=+1165.450848522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "metrics-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.195065 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b"] Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.196596 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9s6b7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-57b484b4df-tlm9k_openstack-operators(7a3a590a-bacd-4d27-91f1-1b6afd52ab3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.198670 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" podUID="7a3a590a-bacd-4d27-91f1-1b6afd52ab3e" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.205117 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t"] Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.210004 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvq6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-wgkn7_openstack-operators(373cda74-3e73-4a92-9d91-395826ab1864): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.212091 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" podUID="373cda74-3e73-4a92-9d91-395826ab1864" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.236058 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpb4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-9rs6t_openstack-operators(633253f4-c3d4-439d-9b79-dee8c6d41bdc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.237343 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" podUID="633253f4-c3d4-439d-9b79-dee8c6d41bdc" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.239252 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6cs7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6854b8b9d9-49ss2_openstack-operators(d92c0e9c-9151-4619-8ce6-9eb5cd77d093): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.242156 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" podUID="d92c0e9c-9151-4619-8ce6-9eb5cd77d093" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.245167 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2"] Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.254001 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859"] Mar 14 07:18:05 crc kubenswrapper[4893]: W0314 07:18:05.260827 4893 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bab7ac9_4699_41cc_b1ca_0f344b13ab15.slice/crio-345fb4c32eafaac24334bb684afe038033ba54179fcdf134a115428736d40554 WatchSource:0}: Error finding container 345fb4c32eafaac24334bb684afe038033ba54179fcdf134a115428736d40554: Status 404 returned error can't find the container with id 345fb4c32eafaac24334bb684afe038033ba54179fcdf134a115428736d40554 Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.263691 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmgbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dq859_openstack-operators(7bab7ac9-4699-41cc-b1ca-0f344b13ab15): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: W0314 07:18:05.263923 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb440d436_7017_4503_823f_16e998dbf74d.slice/crio-19367c58efb5bcab2925e410911d8d3b816f85cfd2efada5b63ee2da379988a1 WatchSource:0}: Error finding container 19367c58efb5bcab2925e410911d8d3b816f85cfd2efada5b63ee2da379988a1: Status 404 returned error can't find the container with id 19367c58efb5bcab2925e410911d8d3b816f85cfd2efada5b63ee2da379988a1 Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.264818 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" podUID="7bab7ac9-4699-41cc-b1ca-0f344b13ab15" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.271362 
4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d5nbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-lbg2b_openstack-operators(b440d436-7017-4503-823f-16e998dbf74d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.272731 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" podUID="b440d436-7017-4503-823f-16e998dbf74d" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.494389 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" event={"ID":"633253f4-c3d4-439d-9b79-dee8c6d41bdc","Type":"ContainerStarted","Data":"8bec6a0424da538c728274de6d75d07f277fffee80c2714df492fb6373daa792"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.496033 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" podUID="633253f4-c3d4-439d-9b79-dee8c6d41bdc" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.496366 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" event={"ID":"7a3a590a-bacd-4d27-91f1-1b6afd52ab3e","Type":"ContainerStarted","Data":"1379f605769c96d46d3e5f424966b60a7d6f841d87b6bfe83fbb2bfe2ee2cba1"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.497339 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" podUID="7a3a590a-bacd-4d27-91f1-1b6afd52ab3e" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.525856 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" event={"ID":"1e068b17-cab4-421d-b501-cd825de6b67c","Type":"ContainerStarted","Data":"7a835885e8a09a06968e14dd01646dfa9fe78e80eccca3d8965f95115c5f2d28"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.527857 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" event={"ID":"9e7e613b-2bb4-40a5-a8c2-5649763d4a61","Type":"ContainerStarted","Data":"806a2857170f48ab9b11bb9f42091c7fcd6b32f6a4aafd5268f789742380ce9d"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.528973 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" 
event={"ID":"01b24940-e97d-472d-902c-87bfe6b67147","Type":"ContainerStarted","Data":"539290a36a1b7f013bb762982397119bc4129fb950ada3ef62e9a8edae79cbbb"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.531490 4893 generic.go:334] "Generic (PLEG): container finished" podID="85464791-1f62-4d33-bd20-813896eef4b8" containerID="748fde2a5a1f03fcf4446be0f3aae719eb6b6ddc7af7b25240b22a1f8bfac2ea" exitCode=0 Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.531710 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" event={"ID":"85464791-1f62-4d33-bd20-813896eef4b8","Type":"ContainerDied","Data":"748fde2a5a1f03fcf4446be0f3aae719eb6b6ddc7af7b25240b22a1f8bfac2ea"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.540370 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" event={"ID":"a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88","Type":"ContainerStarted","Data":"cd69652bed7878e3f8155199bcdb0f1866b535e888728dce934d1c8a2a97ae39"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.542650 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" event={"ID":"7bab7ac9-4699-41cc-b1ca-0f344b13ab15","Type":"ContainerStarted","Data":"345fb4c32eafaac24334bb684afe038033ba54179fcdf134a115428736d40554"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.543722 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" podUID="7bab7ac9-4699-41cc-b1ca-0f344b13ab15" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.544120 4893 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" event={"ID":"2b0418ea-6737-4c52-b1bc-915fba5ef735","Type":"ContainerStarted","Data":"f61c0b58f2c95c0d4766c0255a9e1361f40fd6166eda88f46ec629f0f07021fb"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.545096 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" event={"ID":"373cda74-3e73-4a92-9d91-395826ab1864","Type":"ContainerStarted","Data":"8b3bbe9cfde471afbf92465d4a0fb7be1899a477c4c64f91e08cea3111817e7f"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.546420 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" podUID="373cda74-3e73-4a92-9d91-395826ab1864" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.547677 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" event={"ID":"c50bd703-5cd9-418a-b623-4809a8ff4213","Type":"ContainerStarted","Data":"84215ead3c7fa0916eb38ab92e5f5c723f4c203afd14ae060fce38fdd5d0003c"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.548717 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" event={"ID":"887bf8c6-7149-4e9b-b67c-bc70791532b2","Type":"ContainerStarted","Data":"d6f8ae4626a114e1984634d61e6996fc199d87707cd00bd00bd2d035ee6222aa"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.549666 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" 
event={"ID":"1a01facb-b7c0-476b-96b1-759e6e9f3c30","Type":"ContainerStarted","Data":"0305ba6d4c81a1392fa810f164a718fece56caa555fc43ac6467d3b6a4834953"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.551007 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" event={"ID":"0997c764-6b32-41ea-adca-b04feb1fbe6f","Type":"ContainerStarted","Data":"ef18d423fe84c75f3f9176bc67cbc5199571505961b115bdd43c41b876d6f27c"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.551854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" event={"ID":"b440d436-7017-4503-823f-16e998dbf74d","Type":"ContainerStarted","Data":"19367c58efb5bcab2925e410911d8d3b816f85cfd2efada5b63ee2da379988a1"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.555387 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" event={"ID":"b8da0600-7ae7-4f7a-8b3e-c81523dc6034","Type":"ContainerStarted","Data":"b6d7726af951240d2979a54d591ebc895e81de15973b432a6e9946dedca16f5f"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.556361 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" podUID="b440d436-7017-4503-823f-16e998dbf74d" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.557238 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" 
event={"ID":"26bb4dec-8e1a-4fdc-a7ca-808ae7026afe","Type":"ContainerStarted","Data":"5bc13a2a38a4dee94b491aec9cf3f9e6407543a88fd88abd590ef9419e688311"} Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.559112 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" event={"ID":"d92c0e9c-9151-4619-8ce6-9eb5cd77d093","Type":"ContainerStarted","Data":"bc2dbe62707ea0bfe85fdf0d748857e01a0cc413f1f391ef57be7067ca4d4627"} Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.560735 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" podUID="d92c0e9c-9151-4619-8ce6-9eb5cd77d093" Mar 14 07:18:05 crc kubenswrapper[4893]: I0314 07:18:05.594597 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.595306 4893 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:05 crc kubenswrapper[4893]: E0314 07:18:05.595778 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert podName:ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f nodeName:}" failed. 
No retries permitted until 2026-03-14 07:18:07.595756597 +0000 UTC m=+1166.857933429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" (UID: "ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:06 crc kubenswrapper[4893]: I0314 07:18:06.203916 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:06 crc kubenswrapper[4893]: I0314 07:18:06.204339 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.204132 4893 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.204468 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:08.204441896 +0000 UTC m=+1167.466618688 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "metrics-server-cert" not found Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.204550 4893 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.204620 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:08.20459928 +0000 UTC m=+1167.466776152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "webhook-server-cert" not found Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.572075 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" podUID="7bab7ac9-4699-41cc-b1ca-0f344b13ab15" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.572336 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" podUID="373cda74-3e73-4a92-9d91-395826ab1864" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.572394 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" podUID="b440d436-7017-4503-823f-16e998dbf74d" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.572826 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" podUID="d92c0e9c-9151-4619-8ce6-9eb5cd77d093" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.587340 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" podUID="633253f4-c3d4-439d-9b79-dee8c6d41bdc" Mar 14 07:18:06 crc kubenswrapper[4893]: E0314 07:18:06.589694 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" 
podUID="7a3a590a-bacd-4d27-91f1-1b6afd52ab3e" Mar 14 07:18:07 crc kubenswrapper[4893]: I0314 07:18:07.123055 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:07 crc kubenswrapper[4893]: E0314 07:18:07.123186 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:07 crc kubenswrapper[4893]: E0314 07:18:07.123998 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. No retries permitted until 2026-03-14 07:18:11.123948087 +0000 UTC m=+1170.386124879 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:07 crc kubenswrapper[4893]: I0314 07:18:07.631230 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:07 crc kubenswrapper[4893]: E0314 07:18:07.631444 4893 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:07 crc kubenswrapper[4893]: E0314 07:18:07.631551 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert podName:ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:11.631510308 +0000 UTC m=+1170.893687190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" (UID: "ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:08 crc kubenswrapper[4893]: I0314 07:18:08.242825 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:08 crc kubenswrapper[4893]: I0314 07:18:08.242882 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:08 crc kubenswrapper[4893]: E0314 07:18:08.242995 4893 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:08 crc kubenswrapper[4893]: E0314 07:18:08.243064 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:12.243042212 +0000 UTC m=+1171.505219004 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "metrics-server-cert" not found Mar 14 07:18:08 crc kubenswrapper[4893]: E0314 07:18:08.243121 4893 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:08 crc kubenswrapper[4893]: E0314 07:18:08.243207 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:12.243188755 +0000 UTC m=+1171.505365537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "webhook-server-cert" not found Mar 14 07:18:11 crc kubenswrapper[4893]: I0314 07:18:11.184897 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:11 crc kubenswrapper[4893]: E0314 07:18:11.185076 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:11 crc kubenswrapper[4893]: E0314 07:18:11.185644 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert 
podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. No retries permitted until 2026-03-14 07:18:19.185602425 +0000 UTC m=+1178.447779217 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:11 crc kubenswrapper[4893]: I0314 07:18:11.691105 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:11 crc kubenswrapper[4893]: E0314 07:18:11.691362 4893 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:11 crc kubenswrapper[4893]: E0314 07:18:11.691418 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert podName:ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:19.691403782 +0000 UTC m=+1178.953580574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" (UID: "ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:12 crc kubenswrapper[4893]: I0314 07:18:12.300239 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:12 crc kubenswrapper[4893]: I0314 07:18:12.300285 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:12 crc kubenswrapper[4893]: E0314 07:18:12.300487 4893 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:12 crc kubenswrapper[4893]: E0314 07:18:12.300569 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:20.300553058 +0000 UTC m=+1179.562729850 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "webhook-server-cert" not found Mar 14 07:18:12 crc kubenswrapper[4893]: E0314 07:18:12.300613 4893 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:12 crc kubenswrapper[4893]: E0314 07:18:12.300632 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs podName:be24c0b7-17b5-4962-b12d-f438a21b953f nodeName:}" failed. No retries permitted until 2026-03-14 07:18:20.30062647 +0000 UTC m=+1179.562803252 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-kd7nd" (UID: "be24c0b7-17b5-4962-b12d-f438a21b953f") : secret "metrics-server-cert" not found Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.390311 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.518777 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6bbj\" (UniqueName: \"kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj\") pod \"85464791-1f62-4d33-bd20-813896eef4b8\" (UID: \"85464791-1f62-4d33-bd20-813896eef4b8\") " Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.535412 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj" (OuterVolumeSpecName: "kube-api-access-z6bbj") pod "85464791-1f62-4d33-bd20-813896eef4b8" (UID: "85464791-1f62-4d33-bd20-813896eef4b8"). InnerVolumeSpecName "kube-api-access-z6bbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.613238 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" event={"ID":"85464791-1f62-4d33-bd20-813896eef4b8","Type":"ContainerDied","Data":"22a9be2bbe1e823192c98fe72520e243768427c62ca4bd00647b109e3734c9f1"} Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.613283 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-cvzrx" Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.613307 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a9be2bbe1e823192c98fe72520e243768427c62ca4bd00647b109e3734c9f1" Mar 14 07:18:13 crc kubenswrapper[4893]: I0314 07:18:13.620149 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6bbj\" (UniqueName: \"kubernetes.io/projected/85464791-1f62-4d33-bd20-813896eef4b8-kube-api-access-z6bbj\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:14 crc kubenswrapper[4893]: I0314 07:18:14.458446 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-s4jrn"] Mar 14 07:18:14 crc kubenswrapper[4893]: I0314 07:18:14.464091 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-s4jrn"] Mar 14 07:18:15 crc kubenswrapper[4893]: I0314 07:18:15.385922 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ed59f6-a6b5-4601-97d6-66f298b0fc3e" path="/var/lib/kubelet/pods/f0ed59f6-a6b5-4601-97d6-66f298b0fc3e/volumes" Mar 14 07:18:19 crc kubenswrapper[4893]: I0314 07:18:19.217788 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:19 crc kubenswrapper[4893]: E0314 07:18:19.218006 4893 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:19 crc kubenswrapper[4893]: E0314 07:18:19.218322 4893 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert podName:9966deb5-a260-43a8-bec4-772b6308266d nodeName:}" failed. No retries permitted until 2026-03-14 07:18:35.218302842 +0000 UTC m=+1194.480479654 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-qmt9z" (UID: "9966deb5-a260-43a8-bec4-772b6308266d") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:19 crc kubenswrapper[4893]: I0314 07:18:19.724589 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:19 crc kubenswrapper[4893]: I0314 07:18:19.737466 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d774rh7fr\" (UID: \"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:19 crc kubenswrapper[4893]: I0314 07:18:19.861337 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:19 crc kubenswrapper[4893]: E0314 07:18:19.967793 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 14 07:18:19 crc kubenswrapper[4893]: E0314 07:18:19.967985 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zgxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-xjwp7_openstack-operators(0997c764-6b32-41ea-adca-b04feb1fbe6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:18:19 crc kubenswrapper[4893]: E0314 07:18:19.969093 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" podUID="0997c764-6b32-41ea-adca-b04feb1fbe6f" Mar 14 07:18:20 crc kubenswrapper[4893]: I0314 07:18:20.336362 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:20 crc kubenswrapper[4893]: I0314 07:18:20.336405 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:20 crc kubenswrapper[4893]: I0314 07:18:20.341718 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:20 crc kubenswrapper[4893]: I0314 07:18:20.351320 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be24c0b7-17b5-4962-b12d-f438a21b953f-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-kd7nd\" (UID: \"be24c0b7-17b5-4962-b12d-f438a21b953f\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:20 crc kubenswrapper[4893]: E0314 07:18:20.546744 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 14 07:18:20 crc kubenswrapper[4893]: E0314 07:18:20.547188 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-xfpdm_openstack-operators(01b24940-e97d-472d-902c-87bfe6b67147): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:18:20 crc kubenswrapper[4893]: I0314 07:18:20.546952 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:20 crc kubenswrapper[4893]: E0314 07:18:20.548593 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" podUID="01b24940-e97d-472d-902c-87bfe6b67147" Mar 14 07:18:20 crc kubenswrapper[4893]: E0314 07:18:20.679030 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" podUID="01b24940-e97d-472d-902c-87bfe6b67147" Mar 14 07:18:20 crc kubenswrapper[4893]: E0314 07:18:20.679215 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" podUID="0997c764-6b32-41ea-adca-b04feb1fbe6f" Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.138136 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr"] Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.182185 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd"] Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.691368 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" event={"ID":"1e068b17-cab4-421d-b501-cd825de6b67c","Type":"ContainerStarted","Data":"c7064d0de379bb4a85c186e2843b3fd032a36e7ee2e7819e2f50ccab4e854d2c"} Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.691937 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.695481 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" event={"ID":"a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88","Type":"ContainerStarted","Data":"d8c5524645602a806d7b6fad79e97e9cf5d21a3a4b755664461ef75b77268074"} Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.699731 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.711796 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" podStartSLOduration=3.266113371 podStartE2EDuration="18.711721659s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.161406208 +0000 UTC m=+1164.423583010" lastFinishedPulling="2026-03-14 07:18:20.607014506 +0000 UTC m=+1179.869191298" observedRunningTime="2026-03-14 07:18:21.705647282 +0000 UTC m=+1180.967824104" watchObservedRunningTime="2026-03-14 07:18:21.711721659 +0000 UTC m=+1180.973898451" Mar 14 07:18:21 crc kubenswrapper[4893]: I0314 07:18:21.721560 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" podStartSLOduration=2.671141338 podStartE2EDuration="18.721545089s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" 
firstStartedPulling="2026-03-14 07:18:04.555066628 +0000 UTC m=+1163.817243420" lastFinishedPulling="2026-03-14 07:18:20.605470379 +0000 UTC m=+1179.867647171" observedRunningTime="2026-03-14 07:18:21.716693691 +0000 UTC m=+1180.978870483" watchObservedRunningTime="2026-03-14 07:18:21.721545089 +0000 UTC m=+1180.983721881" Mar 14 07:18:26 crc kubenswrapper[4893]: W0314 07:18:26.686716 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3c47d0_78fd_4b4d_8aa8_66c2379f8c0f.slice/crio-60b0736c46ac163a3c3ba68b03a636e1f06f2f2c3691b174047b60b58b4c9b02 WatchSource:0}: Error finding container 60b0736c46ac163a3c3ba68b03a636e1f06f2f2c3691b174047b60b58b4c9b02: Status 404 returned error can't find the container with id 60b0736c46ac163a3c3ba68b03a636e1f06f2f2c3691b174047b60b58b4c9b02 Mar 14 07:18:26 crc kubenswrapper[4893]: W0314 07:18:26.687913 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe24c0b7_17b5_4962_b12d_f438a21b953f.slice/crio-607a2b5c932885def02d5e1b645dfc3a79f0a7455ff1184cec05496d97135b68 WatchSource:0}: Error finding container 607a2b5c932885def02d5e1b645dfc3a79f0a7455ff1184cec05496d97135b68: Status 404 returned error can't find the container with id 607a2b5c932885def02d5e1b645dfc3a79f0a7455ff1184cec05496d97135b68 Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.737681 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" event={"ID":"bb479ff8-15fb-458b-87c7-ec4b3d15721d","Type":"ContainerStarted","Data":"15591283e895d10d54a9e19df492c691431f23a4fabb1dc46bb51d05ed187c21"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.738086 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:26 crc 
kubenswrapper[4893]: I0314 07:18:26.742106 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" event={"ID":"b8da0600-7ae7-4f7a-8b3e-c81523dc6034","Type":"ContainerStarted","Data":"0e6ac65d00d1c0072958ba7590dd990fbb6b341e3e0f625a02bfb1bec3cb1580"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.742239 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.743508 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" event={"ID":"1a01facb-b7c0-476b-96b1-759e6e9f3c30","Type":"ContainerStarted","Data":"bf9c33d321a36d107165cac167b3fc6b400b393a8f01b26dc8e1425cff64e656"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.743707 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.748788 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" event={"ID":"9e7e613b-2bb4-40a5-a8c2-5649763d4a61","Type":"ContainerStarted","Data":"09c19e9edd9bb09945feab8cbf2db4ca0691e8689d821e13687072d0485dc0ac"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.748922 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.750380 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" 
event={"ID":"26bb4dec-8e1a-4fdc-a7ca-808ae7026afe","Type":"ContainerStarted","Data":"ebffd089fffe9e8e9a62c0f3c28ffd9cfdea8e0ea8580e1c2533ab31517b39e5"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.750994 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.761926 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" event={"ID":"10f5729a-cf9e-4851-9173-2c6c0d6dbcf0","Type":"ContainerStarted","Data":"f7cfd4a511d08813c0803d4759897a33c711e52196dfb066a858a6d979e5743c"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.762459 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.772706 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" event={"ID":"2b0418ea-6737-4c52-b1bc-915fba5ef735","Type":"ContainerStarted","Data":"3ff7bd8bc9c75dcd4ee06ff14486d801e30fa2b37649f39467be6a85f4e657a7"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.773214 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.775304 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" podStartSLOduration=7.480970271 podStartE2EDuration="23.775294855s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.314332792 +0000 UTC m=+1163.576509584" lastFinishedPulling="2026-03-14 07:18:20.608657376 +0000 UTC m=+1179.870834168" 
observedRunningTime="2026-03-14 07:18:26.7660824 +0000 UTC m=+1186.028259192" watchObservedRunningTime="2026-03-14 07:18:26.775294855 +0000 UTC m=+1186.037471647" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.789939 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" event={"ID":"49eeeddb-7017-4642-aabb-ea932fa16ac7","Type":"ContainerStarted","Data":"15cdfac675b13fbdfece519912a1fcfbc625a3e0d6a2a5aacd22a1fb07730a9a"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.790617 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.796946 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" podStartSLOduration=7.705584226 podStartE2EDuration="23.796932602s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.528471362 +0000 UTC m=+1163.790648154" lastFinishedPulling="2026-03-14 07:18:20.619819738 +0000 UTC m=+1179.881996530" observedRunningTime="2026-03-14 07:18:26.796910782 +0000 UTC m=+1186.059087594" watchObservedRunningTime="2026-03-14 07:18:26.796932602 +0000 UTC m=+1186.059109394" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.808424 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" event={"ID":"c50bd703-5cd9-418a-b623-4809a8ff4213","Type":"ContainerStarted","Data":"436ddf2540c1bef2252ff9c6f7ebb4e6a3ae9fb258bfb88604df685115ea86c5"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.809221 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 
07:18:26.813751 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" event={"ID":"be24c0b7-17b5-4962-b12d-f438a21b953f","Type":"ContainerStarted","Data":"607a2b5c932885def02d5e1b645dfc3a79f0a7455ff1184cec05496d97135b68"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.824141 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" podStartSLOduration=7.622401742 podStartE2EDuration="23.824126955s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.406065423 +0000 UTC m=+1163.668242215" lastFinishedPulling="2026-03-14 07:18:20.607790636 +0000 UTC m=+1179.869967428" observedRunningTime="2026-03-14 07:18:26.818858277 +0000 UTC m=+1186.081035069" watchObservedRunningTime="2026-03-14 07:18:26.824126955 +0000 UTC m=+1186.086303747" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.834287 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" event={"ID":"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f","Type":"ContainerStarted","Data":"60b0736c46ac163a3c3ba68b03a636e1f06f2f2c3691b174047b60b58b4c9b02"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.853868 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" event={"ID":"887bf8c6-7149-4e9b-b67c-bc70791532b2","Type":"ContainerStarted","Data":"82da300d25e1c20ed88f1bd180d58d6d667ba4cd19df07046bf68b72cb47fd3a"} Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.854101 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.862644 4893 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" podStartSLOduration=8.407323666 podStartE2EDuration="23.862623253s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.153297558 +0000 UTC m=+1164.415474350" lastFinishedPulling="2026-03-14 07:18:20.608597145 +0000 UTC m=+1179.870773937" observedRunningTime="2026-03-14 07:18:26.843831646 +0000 UTC m=+1186.106008458" watchObservedRunningTime="2026-03-14 07:18:26.862623253 +0000 UTC m=+1186.124800045" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.867819 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" podStartSLOduration=7.832556092 podStartE2EDuration="23.86780834s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.572912587 +0000 UTC m=+1163.835089379" lastFinishedPulling="2026-03-14 07:18:20.608164835 +0000 UTC m=+1179.870341627" observedRunningTime="2026-03-14 07:18:26.855967641 +0000 UTC m=+1186.118144453" watchObservedRunningTime="2026-03-14 07:18:26.86780834 +0000 UTC m=+1186.129985132" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.918020 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" podStartSLOduration=8.494123373 podStartE2EDuration="23.918003963s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.195026837 +0000 UTC m=+1164.457203629" lastFinishedPulling="2026-03-14 07:18:20.618907427 +0000 UTC m=+1179.881084219" observedRunningTime="2026-03-14 07:18:26.908783788 +0000 UTC m=+1186.170960590" watchObservedRunningTime="2026-03-14 07:18:26.918003963 +0000 UTC m=+1186.180180755" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.934496 4893 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" podStartSLOduration=8.521799207 podStartE2EDuration="23.934480064s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.196078182 +0000 UTC m=+1164.458254974" lastFinishedPulling="2026-03-14 07:18:20.608759039 +0000 UTC m=+1179.870935831" observedRunningTime="2026-03-14 07:18:26.928048488 +0000 UTC m=+1186.190225290" watchObservedRunningTime="2026-03-14 07:18:26.934480064 +0000 UTC m=+1186.196656846" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.942930 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" podStartSLOduration=8.479430831 podStartE2EDuration="23.94291663s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.145188138 +0000 UTC m=+1164.407364930" lastFinishedPulling="2026-03-14 07:18:20.608673937 +0000 UTC m=+1179.870850729" observedRunningTime="2026-03-14 07:18:26.939708082 +0000 UTC m=+1186.201884874" watchObservedRunningTime="2026-03-14 07:18:26.94291663 +0000 UTC m=+1186.205093422" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.967570 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" podStartSLOduration=7.65320715 podStartE2EDuration="23.967551021s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.288535005 +0000 UTC m=+1163.550711797" lastFinishedPulling="2026-03-14 07:18:20.602878876 +0000 UTC m=+1179.865055668" observedRunningTime="2026-03-14 07:18:26.954672467 +0000 UTC m=+1186.216849279" watchObservedRunningTime="2026-03-14 07:18:26.967551021 +0000 UTC m=+1186.229727813" Mar 14 07:18:26 crc kubenswrapper[4893]: I0314 07:18:26.970116 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" podStartSLOduration=8.519270639 podStartE2EDuration="23.970100633s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.1688117 +0000 UTC m=+1164.430988492" lastFinishedPulling="2026-03-14 07:18:20.619641694 +0000 UTC m=+1179.881818486" observedRunningTime="2026-03-14 07:18:26.968961535 +0000 UTC m=+1186.231138327" watchObservedRunningTime="2026-03-14 07:18:26.970100633 +0000 UTC m=+1186.232277425" Mar 14 07:18:29 crc kubenswrapper[4893]: I0314 07:18:29.731151 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:18:29 crc kubenswrapper[4893]: I0314 07:18:29.731515 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:18:29 crc kubenswrapper[4893]: I0314 07:18:29.887827 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" event={"ID":"be24c0b7-17b5-4962-b12d-f438a21b953f","Type":"ContainerStarted","Data":"6a40ba9de9f6274826b8153ad96b8e35128b1578cc4cfb84acec7de07ee13d15"} Mar 14 07:18:29 crc kubenswrapper[4893]: I0314 07:18:29.888047 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:29 crc kubenswrapper[4893]: I0314 07:18:29.916861 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" podStartSLOduration=25.916826658 podStartE2EDuration="25.916826658s" podCreationTimestamp="2026-03-14 07:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:18:29.910010172 +0000 UTC m=+1189.172186984" watchObservedRunningTime="2026-03-14 07:18:29.916826658 +0000 UTC m=+1189.179003450" Mar 14 07:18:31 crc kubenswrapper[4893]: I0314 07:18:31.387792 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:18:31 crc kubenswrapper[4893]: I0314 07:18:31.908846 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" event={"ID":"373cda74-3e73-4a92-9d91-395826ab1864","Type":"ContainerStarted","Data":"7a8794a7a5519eee6b325eb6c3ec70ea46d81c02a6643901295e28b634da6493"} Mar 14 07:18:31 crc kubenswrapper[4893]: I0314 07:18:31.909460 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:31 crc kubenswrapper[4893]: I0314 07:18:31.933472 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" podStartSLOduration=5.074722573 podStartE2EDuration="28.933452686s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.209749559 +0000 UTC m=+1164.471926351" lastFinishedPulling="2026-03-14 07:18:29.068479672 +0000 UTC m=+1188.330656464" observedRunningTime="2026-03-14 07:18:31.928964206 +0000 UTC m=+1191.191141058" watchObservedRunningTime="2026-03-14 07:18:31.933452686 +0000 UTC m=+1191.195629478" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.916009 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" event={"ID":"b440d436-7017-4503-823f-16e998dbf74d","Type":"ContainerStarted","Data":"92313b5a15840e4d278efb1cc6aa97cef0defd329555fd6164ef3f8727f89c4d"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.916453 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.917495 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" event={"ID":"7a3a590a-bacd-4d27-91f1-1b6afd52ab3e","Type":"ContainerStarted","Data":"dd65f0cc47540ba2b7cd9820cc8e080416cd82ba8161a171d795613f5703647f"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.917648 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.919005 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" event={"ID":"d92c0e9c-9151-4619-8ce6-9eb5cd77d093","Type":"ContainerStarted","Data":"23afca07a75704bc20668d0893761f514ea4ee68219f69b1b2e2c7d2a92fa72a"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.919191 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.920059 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" event={"ID":"7bab7ac9-4699-41cc-b1ca-0f344b13ab15","Type":"ContainerStarted","Data":"2cfa0f629419488530a0c7bb5f0e21c4ac41ef616e0faaf785d12f60312e429a"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.921371 4893 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" event={"ID":"ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f","Type":"ContainerStarted","Data":"0dcdcdefc1d54f5d70d281842e1458678ddaa7aafb52b7945016f8d96561fa9f"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.921480 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.922586 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" event={"ID":"633253f4-c3d4-439d-9b79-dee8c6d41bdc","Type":"ContainerStarted","Data":"0e01890eda94be33d64aae038ab20dcf22ba21267a638f432c325f919872e255"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.922760 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.923596 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" event={"ID":"0997c764-6b32-41ea-adca-b04feb1fbe6f","Type":"ContainerStarted","Data":"dc6f7ab0ba54179318c3e958110ec7450e12646ac66f5a33d76c3e079c0b19e4"} Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.923897 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.932315 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" podStartSLOduration=6.125791386 podStartE2EDuration="29.932304409s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.271239716 +0000 UTC 
m=+1164.533416508" lastFinishedPulling="2026-03-14 07:18:29.077752749 +0000 UTC m=+1188.339929531" observedRunningTime="2026-03-14 07:18:32.931570661 +0000 UTC m=+1192.193747453" watchObservedRunningTime="2026-03-14 07:18:32.932304409 +0000 UTC m=+1192.194481201" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.959213 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" podStartSLOduration=24.505105732 podStartE2EDuration="29.959194794s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:26.689203237 +0000 UTC m=+1185.951380029" lastFinishedPulling="2026-03-14 07:18:32.143292299 +0000 UTC m=+1191.405469091" observedRunningTime="2026-03-14 07:18:32.956438227 +0000 UTC m=+1192.218615039" watchObservedRunningTime="2026-03-14 07:18:32.959194794 +0000 UTC m=+1192.221371586" Mar 14 07:18:32 crc kubenswrapper[4893]: I0314 07:18:32.987999 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" podStartSLOduration=3.055731274 podStartE2EDuration="29.987981966s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.196447462 +0000 UTC m=+1164.458624244" lastFinishedPulling="2026-03-14 07:18:32.128698144 +0000 UTC m=+1191.390874936" observedRunningTime="2026-03-14 07:18:32.984868031 +0000 UTC m=+1192.247044833" watchObservedRunningTime="2026-03-14 07:18:32.987981966 +0000 UTC m=+1192.250158758" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.016691 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dq859" podStartSLOduration=2.035967371 podStartE2EDuration="29.016675945s" podCreationTimestamp="2026-03-14 07:18:04 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.263472765 +0000 UTC 
m=+1164.525649557" lastFinishedPulling="2026-03-14 07:18:32.244181339 +0000 UTC m=+1191.506358131" observedRunningTime="2026-03-14 07:18:33.015691852 +0000 UTC m=+1192.277868644" watchObservedRunningTime="2026-03-14 07:18:33.016675945 +0000 UTC m=+1192.278852737" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.021543 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" podStartSLOduration=3.042996385 podStartE2EDuration="30.021515164s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.235896174 +0000 UTC m=+1164.498072966" lastFinishedPulling="2026-03-14 07:18:32.214414943 +0000 UTC m=+1191.476591745" observedRunningTime="2026-03-14 07:18:33.004432087 +0000 UTC m=+1192.266608899" watchObservedRunningTime="2026-03-14 07:18:33.021515164 +0000 UTC m=+1192.283691956" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.033802 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" podStartSLOduration=2.954629927 podStartE2EDuration="30.033785332s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.153072702 +0000 UTC m=+1164.415249504" lastFinishedPulling="2026-03-14 07:18:32.232228087 +0000 UTC m=+1191.494404909" observedRunningTime="2026-03-14 07:18:33.031860235 +0000 UTC m=+1192.294037037" watchObservedRunningTime="2026-03-14 07:18:33.033785332 +0000 UTC m=+1192.295962124" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.042452 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" podStartSLOduration=6.203209954 podStartE2EDuration="30.042435283s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:05.239042182 +0000 UTC 
m=+1164.501218974" lastFinishedPulling="2026-03-14 07:18:29.078267511 +0000 UTC m=+1188.340444303" observedRunningTime="2026-03-14 07:18:33.042199658 +0000 UTC m=+1192.304376460" watchObservedRunningTime="2026-03-14 07:18:33.042435283 +0000 UTC m=+1192.304612075" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.566664 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-lg76w" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.578405 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xr2zr" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.592805 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cjqhn" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.638188 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-65klp" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.684942 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qfg79" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.707407 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-djn7l" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.814405 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-gm6pg" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.906843 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-cxg74" Mar 14 
07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.925091 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-zkh7h" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.930688 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" event={"ID":"01b24940-e97d-472d-902c-87bfe6b67147","Type":"ContainerStarted","Data":"a44364787e65d537e44595297e92f5610c7c229ec9ade08d567e2fade6468c53"} Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.936726 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gztc6" Mar 14 07:18:33 crc kubenswrapper[4893]: I0314 07:18:33.961226 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" podStartSLOduration=2.9298209269999997 podStartE2EDuration="30.961210955s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:04.783395457 +0000 UTC m=+1164.045572249" lastFinishedPulling="2026-03-14 07:18:32.814785485 +0000 UTC m=+1192.076962277" observedRunningTime="2026-03-14 07:18:33.956574042 +0000 UTC m=+1193.218750844" watchObservedRunningTime="2026-03-14 07:18:33.961210955 +0000 UTC m=+1193.223387747" Mar 14 07:18:34 crc kubenswrapper[4893]: I0314 07:18:34.324242 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7967f" Mar 14 07:18:34 crc kubenswrapper[4893]: I0314 07:18:34.336441 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kwcbl" Mar 14 07:18:35 crc kubenswrapper[4893]: I0314 07:18:35.275685 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:35 crc kubenswrapper[4893]: I0314 07:18:35.286611 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9966deb5-a260-43a8-bec4-772b6308266d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-qmt9z\" (UID: \"9966deb5-a260-43a8-bec4-772b6308266d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:35 crc kubenswrapper[4893]: I0314 07:18:35.519334 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:35 crc kubenswrapper[4893]: I0314 07:18:35.734203 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z"] Mar 14 07:18:35 crc kubenswrapper[4893]: W0314 07:18:35.741790 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9966deb5_a260_43a8_bec4_772b6308266d.slice/crio-d8a55ee9344f3c0a1103092d91330083fc59a2d9a8268c08f88f14b2045c5926 WatchSource:0}: Error finding container d8a55ee9344f3c0a1103092d91330083fc59a2d9a8268c08f88f14b2045c5926: Status 404 returned error can't find the container with id d8a55ee9344f3c0a1103092d91330083fc59a2d9a8268c08f88f14b2045c5926 Mar 14 07:18:35 crc kubenswrapper[4893]: I0314 07:18:35.949792 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" 
event={"ID":"9966deb5-a260-43a8-bec4-772b6308266d","Type":"ContainerStarted","Data":"d8a55ee9344f3c0a1103092d91330083fc59a2d9a8268c08f88f14b2045c5926"} Mar 14 07:18:37 crc kubenswrapper[4893]: I0314 07:18:37.968032 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" event={"ID":"9966deb5-a260-43a8-bec4-772b6308266d","Type":"ContainerStarted","Data":"c564c42c921a6e029568d9363876479aea820304ad5553b11bc0b3929974b0e0"} Mar 14 07:18:37 crc kubenswrapper[4893]: I0314 07:18:37.969183 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:37 crc kubenswrapper[4893]: I0314 07:18:37.992543 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" podStartSLOduration=33.435062357 podStartE2EDuration="34.992510343s" podCreationTimestamp="2026-03-14 07:18:03 +0000 UTC" firstStartedPulling="2026-03-14 07:18:35.745218414 +0000 UTC m=+1195.007395216" lastFinishedPulling="2026-03-14 07:18:37.3026664 +0000 UTC m=+1196.564843202" observedRunningTime="2026-03-14 07:18:37.988543146 +0000 UTC m=+1197.250719948" watchObservedRunningTime="2026-03-14 07:18:37.992510343 +0000 UTC m=+1197.254687135" Mar 14 07:18:39 crc kubenswrapper[4893]: I0314 07:18:39.869416 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d774rh7fr" Mar 14 07:18:40 crc kubenswrapper[4893]: I0314 07:18:40.552905 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-kd7nd" Mar 14 07:18:43 crc kubenswrapper[4893]: I0314 07:18:43.872363 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:43 crc kubenswrapper[4893]: I0314 07:18:43.875869 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xfpdm" Mar 14 07:18:43 crc kubenswrapper[4893]: I0314 07:18:43.898046 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-tlm9k" Mar 14 07:18:43 crc kubenswrapper[4893]: I0314 07:18:43.943969 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-xjwp7" Mar 14 07:18:44 crc kubenswrapper[4893]: I0314 07:18:44.470081 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-wgkn7" Mar 14 07:18:44 crc kubenswrapper[4893]: I0314 07:18:44.478485 4893 scope.go:117] "RemoveContainer" containerID="7f869d2d08b072906140050fdd8e8329d0ca605ecc94294ecc9799c0092f363d" Mar 14 07:18:44 crc kubenswrapper[4893]: I0314 07:18:44.514553 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-49ss2" Mar 14 07:18:44 crc kubenswrapper[4893]: I0314 07:18:44.573346 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-9rs6t" Mar 14 07:18:44 crc kubenswrapper[4893]: I0314 07:18:44.596863 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lbg2b" Mar 14 07:18:45 crc kubenswrapper[4893]: I0314 07:18:45.526133 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-qmt9z" Mar 14 07:18:59 crc 
kubenswrapper[4893]: I0314 07:18:59.730679 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:18:59 crc kubenswrapper[4893]: I0314 07:18:59.731332 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:18:59 crc kubenswrapper[4893]: I0314 07:18:59.731391 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:18:59 crc kubenswrapper[4893]: I0314 07:18:59.732141 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:18:59 crc kubenswrapper[4893]: I0314 07:18:59.732194 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f" gracePeriod=600 Mar 14 07:19:00 crc kubenswrapper[4893]: I0314 07:19:00.148460 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerID="6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f" exitCode=0 Mar 14 07:19:00 crc kubenswrapper[4893]: I0314 07:19:00.148547 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f"} Mar 14 07:19:00 crc kubenswrapper[4893]: I0314 07:19:00.148914 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656"} Mar 14 07:19:00 crc kubenswrapper[4893]: I0314 07:19:00.148936 4893 scope.go:117] "RemoveContainer" containerID="2fd7b7357426a71964d49e99a8163fe1e89a54e8bb9c768156381da3bae22bd0" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.059940 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"] Mar 14 07:19:02 crc kubenswrapper[4893]: E0314 07:19:02.061039 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85464791-1f62-4d33-bd20-813896eef4b8" containerName="oc" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.061057 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="85464791-1f62-4d33-bd20-813896eef4b8" containerName="oc" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.061230 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="85464791-1f62-4d33-bd20-813896eef4b8" containerName="oc" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.062159 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.066491 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-brppt" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.066772 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.066900 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.067001 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.068954 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"] Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.104486 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"] Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.105820 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.115477 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.119197 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"]
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.218480 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rm4\" (UniqueName: \"kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.218547 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8dtf\" (UniqueName: \"kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.218594 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.218667 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.218692 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.320332 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rm4\" (UniqueName: \"kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.320380 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8dtf\" (UniqueName: \"kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.320422 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.320448 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.320473 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.321391 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.321446 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.321509 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.339752 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8dtf\" (UniqueName: \"kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf\") pod \"dnsmasq-dns-5448ff6dc7-nqmg5\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.340605 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rm4\" (UniqueName: \"kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4\") pod \"dnsmasq-dns-64696987c5-whdn2\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.382887 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.425846 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-whdn2"
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.609721 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"]
Mar 14 07:19:02 crc kubenswrapper[4893]: I0314 07:19:02.744425 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"]
Mar 14 07:19:02 crc kubenswrapper[4893]: W0314 07:19:02.751703 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0db85c4_654a_44af_9f4a_68f4e7f72ba7.slice/crio-719a6a92cbc3d9073df716850ccd7023e018219213a9636b92c9979b852cfa84 WatchSource:0}: Error finding container 719a6a92cbc3d9073df716850ccd7023e018219213a9636b92c9979b852cfa84: Status 404 returned error can't find the container with id 719a6a92cbc3d9073df716850ccd7023e018219213a9636b92c9979b852cfa84
Mar 14 07:19:03 crc kubenswrapper[4893]: I0314 07:19:03.172364 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-whdn2" event={"ID":"a0db85c4-654a-44af-9f4a-68f4e7f72ba7","Type":"ContainerStarted","Data":"719a6a92cbc3d9073df716850ccd7023e018219213a9636b92c9979b852cfa84"}
Mar 14 07:19:03 crc kubenswrapper[4893]: I0314 07:19:03.173134 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" event={"ID":"1b03746c-6794-4325-a2ea-d43d5bf418f2","Type":"ContainerStarted","Data":"f24a87b724d3505789b7b4ed829600523b0d3d78692c80226d389808f1a00271"}
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.168858 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.210367 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.211382 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.227044 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.348126 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvdn\" (UniqueName: \"kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.348181 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.348369 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.449199 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvdn\" (UniqueName: \"kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.449252 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.449299 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.450194 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.450270 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.467654 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvdn\" (UniqueName: \"kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn\") pod \"dnsmasq-dns-854f47b4f9-4n27f\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.537593 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.885566 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.910816 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.912386 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.929603 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"]
Mar 14 07:19:04 crc kubenswrapper[4893]: I0314 07:19:04.979105 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"]
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.059194 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.059313 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs56l\" (UniqueName: \"kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.059336 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.161016 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs56l\" (UniqueName: \"kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.161061 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.161090 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.161926 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.162027 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.183703 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs56l\" (UniqueName: \"kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l\") pod \"dnsmasq-dns-54b5dffb47-dn9fz\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.196479 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" event={"ID":"4980f8ae-d3e9-4a80-8257-c40696e11036","Type":"ContainerStarted","Data":"0fa90bfb5f1581490e308d2ba53933cca665b5835491e4263ce6b46526e0aec0"}
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.237330 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.329004 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.330063 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.331768 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.335740 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xtbrb"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.335968 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.336083 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.336204 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.336310 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.336413 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.348139 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465183 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465487 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465510 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465545 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465758 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qkl\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465824 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.465997 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.466041 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.466066 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.466109 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.466149 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567220 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567354 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567377 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567396 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567420 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567442 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567471 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567486 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567505 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567548 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.567571 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkl\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.568393 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.568883 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.568883 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.579216 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.579513 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.579673 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.579739 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.579792 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.586876 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.587212 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.595547 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qkl\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.604989 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.664933 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 14 07:19:05 crc kubenswrapper[4893]: I0314 07:19:05.700116 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"]
Mar 14 07:19:05 crc kubenswrapper[4893]: W0314 07:19:05.718597 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod664bdee5_ebab_425c_b11e_2f8eed685b18.slice/crio-6eeed7a63111cb5779ae09fda949f727116b3d8743848089ef7a06b539f4a690 WatchSource:0}: Error finding container 6eeed7a63111cb5779ae09fda949f727116b3d8743848089ef7a06b539f4a690: Status 404 returned error can't find the container with id 6eeed7a63111cb5779ae09fda949f727116b3d8743848089ef7a06b539f4a690
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.019879 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.024031 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.026947 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rzhtv"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.027125 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.029225 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.029627 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.029819 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.029948 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.030050 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.031416 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.145688 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188570 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188639 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188771 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188809 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188834 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188881 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188912 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.188971 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ws2\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.189052 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.189081 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.189096 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.210922 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" event={"ID":"664bdee5-ebab-425c-b11e-2f8eed685b18","Type":"ContainerStarted","Data":"6eeed7a63111cb5779ae09fda949f727116b3d8743848089ef7a06b539f4a690"}
Mar 14 07:19:06 crc kubenswrapper[4893]: W0314 07:19:06.252078 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda752b3c8_284e_490f_be39_506e7a075c6f.slice/crio-0ba35f303367256d7eb006970aae07499ffd749050a21d793a6fe41fb1adbe09 WatchSource:0}: Error finding container 0ba35f303367256d7eb006970aae07499ffd749050a21d793a6fe41fb1adbe09: Status 404 returned error can't find the container with id 0ba35f303367256d7eb006970aae07499ffd749050a21d793a6fe41fb1adbe09
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291330 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291380 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291402 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 07:19:06 crc kubenswrapper[4893]:
I0314 07:19:06.291426 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291443 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291471 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291494 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291552 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ws2\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291588 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291608 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.291628 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.292353 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.292481 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.292615 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.292618 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.293212 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.293744 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.300950 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.302951 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.305859 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.310945 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.311801 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ws2\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.318951 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.410896 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:06 crc kubenswrapper[4893]: I0314 07:19:06.846486 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.226442 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerStarted","Data":"0ba35f303367256d7eb006970aae07499ffd749050a21d793a6fe41fb1adbe09"} Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.227760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerStarted","Data":"daa8dcd940f0cd32125e66a1b55c50274984ef532a9dc79df60326ce40eec1e0"} Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.424019 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.425221 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.433674 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-f6n62" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.433986 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.443827 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.447331 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.447435 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.451248 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.538799 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539248 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539289 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539318 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539371 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539392 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vqj\" (UniqueName: \"kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539433 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.539464 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.640676 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.640893 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.640956 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641006 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641057 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641094 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vqj\" (UniqueName: \"kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641122 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641146 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641654 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.641906 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: 
I0314 07:19:07.641957 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.642539 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.642892 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.647812 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.672835 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.674247 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.674647 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vqj\" (UniqueName: \"kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj\") pod \"openstack-galera-0\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " pod="openstack/openstack-galera-0" Mar 14 07:19:07 crc kubenswrapper[4893]: I0314 07:19:07.754864 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:19:08 crc kubenswrapper[4893]: I0314 07:19:08.336582 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.015339 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.017344 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.031702 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.032107 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.032293 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.032452 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.032643 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hcpqx" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.174375 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlt62\" (UniqueName: \"kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175356 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175390 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175412 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175443 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175457 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175490 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.175530 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.219512 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.221017 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.223075 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9f587" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.223081 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.223633 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.234332 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276329 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgfn5\" (UniqueName: \"kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276377 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: 
I0314 07:19:09.276403 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276425 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276453 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276468 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276508 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276559 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276585 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276605 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276626 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlt62\" (UniqueName: \"kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276642 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276668 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276837 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.276990 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.277680 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.277950 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.279211 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.294664 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.301044 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerStarted","Data":"7e7da4e22bf60aa0ec9b55713ac88c949061bd33595456b4882b823f5ccc3da7"} Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.301286 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.305136 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlt62\" (UniqueName: \"kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.312496 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.362023 4893 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.377731 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.378668 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.378739 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.378787 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.378622 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.378871 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-dgfn5\" (UniqueName: \"kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.379353 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.381562 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.389552 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.406922 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgfn5\" (UniqueName: \"kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5\") pod \"memcached-0\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") " pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.549606 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 14 07:19:09 crc kubenswrapper[4893]: I0314 07:19:09.855128 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:19:09 crc kubenswrapper[4893]: W0314 07:19:09.950986 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6afb83_bfaa_41a6_8429_b8588d82c7a7.slice/crio-9c18ae52999bed19e6062471aa41398ff6a7cc15c07667a19f1cbd9e1d26ed4a WatchSource:0}: Error finding container 9c18ae52999bed19e6062471aa41398ff6a7cc15c07667a19f1cbd9e1d26ed4a: Status 404 returned error can't find the container with id 9c18ae52999bed19e6062471aa41398ff6a7cc15c07667a19f1cbd9e1d26ed4a Mar 14 07:19:10 crc kubenswrapper[4893]: I0314 07:19:10.094154 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 07:19:10 crc kubenswrapper[4893]: W0314 07:19:10.095650 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de6c560_1e2c_4dca_b4c2_be4e51a5300f.slice/crio-83202805168c64d677b4c93e1f0226f435335bc812fe65a507a45275611928bf WatchSource:0}: Error finding container 83202805168c64d677b4c93e1f0226f435335bc812fe65a507a45275611928bf: Status 404 returned error can't find the container with id 83202805168c64d677b4c93e1f0226f435335bc812fe65a507a45275611928bf Mar 14 07:19:10 crc kubenswrapper[4893]: I0314 07:19:10.309556 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6de6c560-1e2c-4dca-b4c2-be4e51a5300f","Type":"ContainerStarted","Data":"83202805168c64d677b4c93e1f0226f435335bc812fe65a507a45275611928bf"} Mar 14 07:19:10 crc kubenswrapper[4893]: I0314 07:19:10.318853 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerStarted","Data":"9c18ae52999bed19e6062471aa41398ff6a7cc15c07667a19f1cbd9e1d26ed4a"} Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.251847 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.255083 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.258380 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pxjhg" Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.265370 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.310327 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw86\" (UniqueName: \"kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86\") pod \"kube-state-metrics-0\" (UID: \"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc\") " pod="openstack/kube-state-metrics-0" Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.411504 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw86\" (UniqueName: \"kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86\") pod \"kube-state-metrics-0\" (UID: \"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc\") " pod="openstack/kube-state-metrics-0" Mar 14 07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.439365 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw86\" (UniqueName: \"kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86\") pod \"kube-state-metrics-0\" (UID: \"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc\") " pod="openstack/kube-state-metrics-0" Mar 14 
07:19:11 crc kubenswrapper[4893]: I0314 07:19:11.607005 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.140822 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8rcbf"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.142263 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.143900 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vsfql" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.144599 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.144778 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.151082 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.235374 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.237258 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.247281 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.298079 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.298145 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.298726 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rzg\" (UniqueName: \"kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.298970 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.299033 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.299075 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.299101 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400157 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400228 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400258 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run\") pod 
\"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400281 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400599 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hc7q\" (UniqueName: \"kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400746 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400802 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rzg\" (UniqueName: \"kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg\") pod \"ovn-controller-8rcbf\" (UID: 
\"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400829 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400861 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400913 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400949 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.400998 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.401034 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.401105 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.401307 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.403030 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.412086 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.420289 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rzg\" (UniqueName: \"kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg\") pod 
\"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.423366 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs\") pod \"ovn-controller-8rcbf\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.503774 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.503901 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hc7q\" (UniqueName: \"kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.503935 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.503976 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 
07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.504017 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.504105 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.504372 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.505270 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.505381 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.505577 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.505818 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.517737 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.522037 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hc7q\" (UniqueName: \"kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q\") pod \"ovn-controller-ovs-bwq2l\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.554828 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.791776 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.792882 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.796570 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.796731 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.797494 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.797702 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.797765 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-szjt9" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.820782 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.910962 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911414 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911563 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxtq\" 
(UniqueName: \"kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911752 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911876 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911903 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.911973 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:15 crc kubenswrapper[4893]: I0314 07:19:15.912029 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013615 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxtq\" (UniqueName: \"kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013667 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013712 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013729 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013751 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013779 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013827 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.013852 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.014160 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.014795 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.015165 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.015221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.018098 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.018421 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.019397 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.032544 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxtq\" (UniqueName: \"kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " 
pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.042891 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:16 crc kubenswrapper[4893]: I0314 07:19:16.118281 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.161104 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.163303 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.166555 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qdxpn" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.167005 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.167995 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.168153 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.187385 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.356583 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.356933 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.356991 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.357013 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phk4\" (UniqueName: \"kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.357037 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.357085 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.357107 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.357132 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458481 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458581 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458605 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458628 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458683 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458704 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458749 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458767 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phk4\" (UniqueName: \"kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.458902 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.459588 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.459780 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.460800 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.464245 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.464273 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.464294 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.473097 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phk4\" (UniqueName: \"kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.487566 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:18 crc kubenswrapper[4893]: I0314 07:19:18.501060 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:22 crc kubenswrapper[4893]: E0314 07:19:22.341027 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d" Mar 14 07:19:22 crc kubenswrapper[4893]: E0314 07:19:22.341689 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5ws2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:19:22 crc 
kubenswrapper[4893]: E0314 07:19:22.342969 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" Mar 14 07:19:33 crc kubenswrapper[4893]: E0314 07:19:33.458003 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached@sha256:0d759b31e4da88b3fa1b823ab634d982fd713e81ed648626de1d8ec40ae7cad4" Mar 14 07:19:33 crc kubenswrapper[4893]: E0314 07:19:33.458841 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached@sha256:0d759b31e4da88b3fa1b823ab634d982fd713e81ed648626de1d8ec40ae7cad4,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n575h74h688h594h684h5c9h587hb6h7dh96hdbh554h654hffh5dfh5d8hc9h67h558h64h7ch5f5h569h597h659h67bhc7h555h66h5d8h5c8h55q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/
config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgfn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
memcached-0_openstack(6de6c560-1e2c-4dca-b4c2-be4e51a5300f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:19:33 crc kubenswrapper[4893]: E0314 07:19:33.459963 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" Mar 14 07:19:33 crc kubenswrapper[4893]: E0314 07:19:33.557025 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached@sha256:0d759b31e4da88b3fa1b823ab634d982fd713e81ed648626de1d8ec40ae7cad4\\\"\"" pod="openstack/memcached-0" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.200854 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.201706 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8dtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-nqmg5_openstack(1b03746c-6794-4325-a2ea-d43d5bf418f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.210347 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" podUID="1b03746c-6794-4325-a2ea-d43d5bf418f2" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.223515 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.223802 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jcvdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-854f47b4f9-4n27f_openstack(4980f8ae-d3e9-4a80-8257-c40696e11036): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.225495 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" podUID="4980f8ae-d3e9-4a80-8257-c40696e11036" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.232910 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.233063 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs56l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-dn9fz_openstack(664bdee5-ebab-425c-b11e-2f8eed685b18): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.240632 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" podUID="664bdee5-ebab-425c-b11e-2f8eed685b18" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.274623 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.275771 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75rm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-whdn2_openstack(a0db85c4-654a-44af-9f4a-68f4e7f72ba7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.276969 4893 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-whdn2" podUID="a0db85c4-654a-44af-9f4a-68f4e7f72ba7" Mar 14 07:19:34 crc kubenswrapper[4893]: I0314 07:19:34.568159 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerStarted","Data":"5d483ecdea3679e076c45905acc44deb617277c515c2b95cca2cf608060961b7"} Mar 14 07:19:34 crc kubenswrapper[4893]: I0314 07:19:34.569947 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerStarted","Data":"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee"} Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.571567 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" podUID="664bdee5-ebab-425c-b11e-2f8eed685b18" Mar 14 07:19:34 crc kubenswrapper[4893]: E0314 07:19:34.572039 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" podUID="4980f8ae-d3e9-4a80-8257-c40696e11036" Mar 14 07:19:34 crc kubenswrapper[4893]: I0314 07:19:34.662118 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf"] Mar 14 07:19:34 crc kubenswrapper[4893]: I0314 07:19:34.672725 4893 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:19:34 crc kubenswrapper[4893]: W0314 07:19:34.763458 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3f59e3_5322_4aa3_94c7_de3cc01c39cc.slice/crio-8eeb92ae66581a8ddc0966be1f4628f6975f2e307c6746198108c7c0efca84f4 WatchSource:0}: Error finding container 8eeb92ae66581a8ddc0966be1f4628f6975f2e307c6746198108c7c0efca84f4: Status 404 returned error can't find the container with id 8eeb92ae66581a8ddc0966be1f4628f6975f2e307c6746198108c7c0efca84f4 Mar 14 07:19:34 crc kubenswrapper[4893]: I0314 07:19:34.874894 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:19:34 crc kubenswrapper[4893]: W0314 07:19:34.965885 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode12ac0dd_46ad_4d51_9ebc_acdd264649b2.slice/crio-30159984625ce8d1a17f6484306a447ef75f2b4c6c364fedd5f4711d1cf4fe3e WatchSource:0}: Error finding container 30159984625ce8d1a17f6484306a447ef75f2b4c6c364fedd5f4711d1cf4fe3e: Status 404 returned error can't find the container with id 30159984625ce8d1a17f6484306a447ef75f2b4c6c364fedd5f4711d1cf4fe3e Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.011657 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 07:19:35 crc kubenswrapper[4893]: W0314 07:19:35.075154 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec3a7835_99ba_4d0d_b81d_2dea0dc7128b.slice/crio-4272f8f5af088c355e4ceba4df8758997438611eb4234c6ad57c12eec356d5bb WatchSource:0}: Error finding container 4272f8f5af088c355e4ceba4df8758997438611eb4234c6ad57c12eec356d5bb: Status 404 returned error can't find the container with id 
4272f8f5af088c355e4ceba4df8758997438611eb4234c6ad57c12eec356d5bb Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.134375 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-whdn2" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.137964 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239047 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config\") pod \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239121 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc\") pod \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239180 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config\") pod \"1b03746c-6794-4325-a2ea-d43d5bf418f2\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239207 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8dtf\" (UniqueName: \"kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf\") pod \"1b03746c-6794-4325-a2ea-d43d5bf418f2\" (UID: \"1b03746c-6794-4325-a2ea-d43d5bf418f2\") " Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239259 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-75rm4\" (UniqueName: \"kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4\") pod \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\" (UID: \"a0db85c4-654a-44af-9f4a-68f4e7f72ba7\") " Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239673 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config" (OuterVolumeSpecName: "config") pod "a0db85c4-654a-44af-9f4a-68f4e7f72ba7" (UID: "a0db85c4-654a-44af-9f4a-68f4e7f72ba7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.239954 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config" (OuterVolumeSpecName: "config") pod "1b03746c-6794-4325-a2ea-d43d5bf418f2" (UID: "1b03746c-6794-4325-a2ea-d43d5bf418f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.240160 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0db85c4-654a-44af-9f4a-68f4e7f72ba7" (UID: "a0db85c4-654a-44af-9f4a-68f4e7f72ba7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.262383 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4" (OuterVolumeSpecName: "kube-api-access-75rm4") pod "a0db85c4-654a-44af-9f4a-68f4e7f72ba7" (UID: "a0db85c4-654a-44af-9f4a-68f4e7f72ba7"). InnerVolumeSpecName "kube-api-access-75rm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.262455 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf" (OuterVolumeSpecName: "kube-api-access-h8dtf") pod "1b03746c-6794-4325-a2ea-d43d5bf418f2" (UID: "1b03746c-6794-4325-a2ea-d43d5bf418f2"). InnerVolumeSpecName "kube-api-access-h8dtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.341418 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b03746c-6794-4325-a2ea-d43d5bf418f2-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.341456 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8dtf\" (UniqueName: \"kubernetes.io/projected/1b03746c-6794-4325-a2ea-d43d5bf418f2-kube-api-access-h8dtf\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.341467 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75rm4\" (UniqueName: \"kubernetes.io/projected/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-kube-api-access-75rm4\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.341478 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.341488 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db85c4-654a-44af-9f4a-68f4e7f72ba7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.577561 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerStarted","Data":"292f113e5f439a52265f81f58611b262cd2d04d5dfef8d95fd26c5f4c46fb3b1"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.582528 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerStarted","Data":"30159984625ce8d1a17f6484306a447ef75f2b4c6c364fedd5f4711d1cf4fe3e"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.584096 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-whdn2" event={"ID":"a0db85c4-654a-44af-9f4a-68f4e7f72ba7","Type":"ContainerDied","Data":"719a6a92cbc3d9073df716850ccd7023e018219213a9636b92c9979b852cfa84"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.584157 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-whdn2" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.587085 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerStarted","Data":"4272f8f5af088c355e4ceba4df8758997438611eb4234c6ad57c12eec356d5bb"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.588092 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf" event={"ID":"a4b44171-12ae-4a98-aac1-1adc9dff3941","Type":"ContainerStarted","Data":"054ed7098a508e1c9b815a5bf9a718392396f2369433d55ab70a0ca01821ed1c"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.589139 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" event={"ID":"1b03746c-6794-4325-a2ea-d43d5bf418f2","Type":"ContainerDied","Data":"f24a87b724d3505789b7b4ed829600523b0d3d78692c80226d389808f1a00271"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.589188 4893 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-nqmg5" Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.592321 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc","Type":"ContainerStarted","Data":"8eeb92ae66581a8ddc0966be1f4628f6975f2e307c6746198108c7c0efca84f4"} Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.660280 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"] Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.672629 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-whdn2"] Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.684582 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"] Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.695764 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-nqmg5"] Mar 14 07:19:35 crc kubenswrapper[4893]: I0314 07:19:35.723281 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:19:36 crc kubenswrapper[4893]: I0314 07:19:36.600729 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerStarted","Data":"4b5631e650c731c2a185bb7430db72d82be2d70d5dbcd4c9189cb4f2306de86d"} Mar 14 07:19:36 crc kubenswrapper[4893]: I0314 07:19:36.603132 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerStarted","Data":"682d8cd293d9dce0bae81b6b24fad5bbdb451e309a9432d29cbee73b0bde8366"} Mar 14 07:19:37 crc kubenswrapper[4893]: I0314 07:19:37.385159 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b03746c-6794-4325-a2ea-d43d5bf418f2" 
path="/var/lib/kubelet/pods/1b03746c-6794-4325-a2ea-d43d5bf418f2/volumes" Mar 14 07:19:37 crc kubenswrapper[4893]: I0314 07:19:37.385808 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0db85c4-654a-44af-9f4a-68f4e7f72ba7" path="/var/lib/kubelet/pods/a0db85c4-654a-44af-9f4a-68f4e7f72ba7/volumes" Mar 14 07:19:38 crc kubenswrapper[4893]: I0314 07:19:38.617754 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerID="5d483ecdea3679e076c45905acc44deb617277c515c2b95cca2cf608060961b7" exitCode=0 Mar 14 07:19:38 crc kubenswrapper[4893]: I0314 07:19:38.617861 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerDied","Data":"5d483ecdea3679e076c45905acc44deb617277c515c2b95cca2cf608060961b7"} Mar 14 07:19:38 crc kubenswrapper[4893]: I0314 07:19:38.619837 4893 generic.go:334] "Generic (PLEG): container finished" podID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerID="ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee" exitCode=0 Mar 14 07:19:38 crc kubenswrapper[4893]: I0314 07:19:38.619884 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerDied","Data":"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.630250 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerStarted","Data":"d32e5cf30b4cda1330ae637ea8cabb6506d4c4567367cf5b048fc3ef60031d41"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.632164 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerStarted","Data":"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.633919 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerStarted","Data":"89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.635010 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerStarted","Data":"56801d7dbcb1352157e545571b9cb4eda563977d1b3b6342a302dc72311e1b23"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.636260 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf" event={"ID":"a4b44171-12ae-4a98-aac1-1adc9dff3941","Type":"ContainerStarted","Data":"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.636319 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8rcbf" Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.637928 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerStarted","Data":"027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.639468 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc","Type":"ContainerStarted","Data":"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce"} Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.639553 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.657321 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.87803777 podStartE2EDuration="33.657291393s" podCreationTimestamp="2026-03-14 07:19:06 +0000 UTC" firstStartedPulling="2026-03-14 07:19:08.366865816 +0000 UTC m=+1227.629042608" lastFinishedPulling="2026-03-14 07:19:34.146119449 +0000 UTC m=+1253.408296231" observedRunningTime="2026-03-14 07:19:39.651478222 +0000 UTC m=+1258.913655044" watchObservedRunningTime="2026-03-14 07:19:39.657291393 +0000 UTC m=+1258.919468175" Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.672235 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.418658962 podStartE2EDuration="28.672217237s" podCreationTimestamp="2026-03-14 07:19:11 +0000 UTC" firstStartedPulling="2026-03-14 07:19:34.766807146 +0000 UTC m=+1254.028983938" lastFinishedPulling="2026-03-14 07:19:39.020365411 +0000 UTC m=+1258.282542213" observedRunningTime="2026-03-14 07:19:39.668844895 +0000 UTC m=+1258.931021697" watchObservedRunningTime="2026-03-14 07:19:39.672217237 +0000 UTC m=+1258.934394029" Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.689063 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8rcbf" podStartSLOduration=20.633166001 podStartE2EDuration="24.689042287s" podCreationTimestamp="2026-03-14 07:19:15 +0000 UTC" firstStartedPulling="2026-03-14 07:19:34.96513631 +0000 UTC m=+1254.227313092" lastFinishedPulling="2026-03-14 07:19:39.021012576 +0000 UTC m=+1258.283189378" observedRunningTime="2026-03-14 07:19:39.683951223 +0000 UTC m=+1258.946128035" watchObservedRunningTime="2026-03-14 07:19:39.689042287 +0000 UTC m=+1258.951219079" Mar 14 07:19:39 crc kubenswrapper[4893]: I0314 07:19:39.732567 4893 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.46800888 podStartE2EDuration="32.732550348s" podCreationTimestamp="2026-03-14 07:19:07 +0000 UTC" firstStartedPulling="2026-03-14 07:19:09.959592892 +0000 UTC m=+1229.221769684" lastFinishedPulling="2026-03-14 07:19:34.22413436 +0000 UTC m=+1253.486311152" observedRunningTime="2026-03-14 07:19:39.73100583 +0000 UTC m=+1258.993182622" watchObservedRunningTime="2026-03-14 07:19:39.732550348 +0000 UTC m=+1258.994727140" Mar 14 07:19:40 crc kubenswrapper[4893]: I0314 07:19:40.648803 4893 generic.go:334] "Generic (PLEG): container finished" podID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerID="56801d7dbcb1352157e545571b9cb4eda563977d1b3b6342a302dc72311e1b23" exitCode=0 Mar 14 07:19:40 crc kubenswrapper[4893]: I0314 07:19:40.650234 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerDied","Data":"56801d7dbcb1352157e545571b9cb4eda563977d1b3b6342a302dc72311e1b23"} Mar 14 07:19:42 crc kubenswrapper[4893]: I0314 07:19:42.695608 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerStarted","Data":"8649f7f4f4c1d54eaf72dbf5276bdbd35d2015a4eb0c3467c699f17e2aee8e05"} Mar 14 07:19:42 crc kubenswrapper[4893]: I0314 07:19:42.708037 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.190614015 podStartE2EDuration="28.707973152s" podCreationTimestamp="2026-03-14 07:19:14 +0000 UTC" firstStartedPulling="2026-03-14 07:19:35.873469716 +0000 UTC m=+1255.135646508" lastFinishedPulling="2026-03-14 07:19:42.390828833 +0000 UTC m=+1261.653005645" observedRunningTime="2026-03-14 07:19:42.705515432 +0000 UTC m=+1261.967692284" watchObservedRunningTime="2026-03-14 07:19:42.707973152 +0000 UTC m=+1261.970149964" 
Mar 14 07:19:42 crc kubenswrapper[4893]: I0314 07:19:42.737394 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.308063427 podStartE2EDuration="25.737348918s" podCreationTimestamp="2026-03-14 07:19:17 +0000 UTC" firstStartedPulling="2026-03-14 07:19:34.967767694 +0000 UTC m=+1254.229944486" lastFinishedPulling="2026-03-14 07:19:42.397053145 +0000 UTC m=+1261.659229977" observedRunningTime="2026-03-14 07:19:42.732782846 +0000 UTC m=+1261.994959658" watchObservedRunningTime="2026-03-14 07:19:42.737348918 +0000 UTC m=+1261.999525710" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.120065 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.170094 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.501793 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.716075 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerStarted","Data":"14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb"} Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.716431 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerStarted","Data":"815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852"} Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.716673 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.716695 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.718829 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerStarted","Data":"eace24dbb800288f6b1dfc821a8adc0ebb57521e5ce25c1b28ee9518e85fe50f"} Mar 14 07:19:43 crc kubenswrapper[4893]: I0314 07:19:43.719482 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:44 crc kubenswrapper[4893]: I0314 07:19:44.799130 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 14 07:19:44 crc kubenswrapper[4893]: I0314 07:19:44.840233 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bwq2l" podStartSLOduration=25.900701157 podStartE2EDuration="29.840209118s" podCreationTimestamp="2026-03-14 07:19:15 +0000 UTC" firstStartedPulling="2026-03-14 07:19:35.079074406 +0000 UTC m=+1254.341251228" lastFinishedPulling="2026-03-14 07:19:39.018582387 +0000 UTC m=+1258.280759189" observedRunningTime="2026-03-14 07:19:43.740482236 +0000 UTC m=+1263.002659028" watchObservedRunningTime="2026-03-14 07:19:44.840209118 +0000 UTC m=+1264.102385940" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.080563 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.132918 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.134323 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.136091 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.141850 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.201553 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.202281 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvkbv\" (UniqueName: \"kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.202349 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.202401 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.202425 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.203350 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.206410 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.216201 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.305461 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvkbv\" (UniqueName: \"kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.305803 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.305831 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.305970 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306042 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306073 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306125 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306151 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82fz\" (UniqueName: \"kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306187 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306291 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.306729 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.307145 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.307364 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.326264 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bvkbv\" (UniqueName: \"kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv\") pod \"dnsmasq-dns-7988f9db49-h6b2h\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407759 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407798 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407824 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407851 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82fz\" (UniqueName: \"kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407870 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.407904 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.408180 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.408223 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.408858 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.412417 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.416825 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.423739 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82fz\" (UniqueName: \"kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz\") pod \"ovn-controller-metrics-7bbf4\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.459942 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.471004 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.501460 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.509420 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc\") pod \"664bdee5-ebab-425c-b11e-2f8eed685b18\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.509639 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs56l\" (UniqueName: \"kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l\") pod \"664bdee5-ebab-425c-b11e-2f8eed685b18\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.509675 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config\") pod \"664bdee5-ebab-425c-b11e-2f8eed685b18\" (UID: \"664bdee5-ebab-425c-b11e-2f8eed685b18\") " Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.509862 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "664bdee5-ebab-425c-b11e-2f8eed685b18" (UID: "664bdee5-ebab-425c-b11e-2f8eed685b18"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.510361 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.514008 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config" (OuterVolumeSpecName: "config") pod "664bdee5-ebab-425c-b11e-2f8eed685b18" (UID: "664bdee5-ebab-425c-b11e-2f8eed685b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.514294 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l" (OuterVolumeSpecName: "kube-api-access-vs56l") pod "664bdee5-ebab-425c-b11e-2f8eed685b18" (UID: "664bdee5-ebab-425c-b11e-2f8eed685b18"). InnerVolumeSpecName "kube-api-access-vs56l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.524891 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.541107 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.557050 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.558185 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.574992 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.589384 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.603539 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613435 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613507 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613553 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613621 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfsfj\" (UniqueName: 
\"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613647 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613703 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs56l\" (UniqueName: \"kubernetes.io/projected/664bdee5-ebab-425c-b11e-2f8eed685b18-kube-api-access-vs56l\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.613716 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/664bdee5-ebab-425c-b11e-2f8eed685b18-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.715139 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.715198 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.715270 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfsfj\" (UniqueName: \"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.715290 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.715322 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.716152 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.716415 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.716467 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.717029 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.734017 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfsfj\" (UniqueName: \"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj\") pod \"dnsmasq-dns-5d944d7b75-mrhh7\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.735822 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" event={"ID":"664bdee5-ebab-425c-b11e-2f8eed685b18","Type":"ContainerDied","Data":"6eeed7a63111cb5779ae09fda949f727116b3d8743848089ef7a06b539f4a690"} Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.735923 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-dn9fz" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.789128 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.802287 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.812434 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-dn9fz"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.892512 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.952041 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.959731 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.969240 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.971351 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.983914 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.983935 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.984129 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 14 07:19:45 crc kubenswrapper[4893]: I0314 07:19:45.984196 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vgplb" Mar 14 07:19:45 crc kubenswrapper[4893]: W0314 07:19:45.988285 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf35a681c_c2d4_4513_8007_a3edefae561d.slice/crio-c387672b68c23b026b8a386cf05ba0b503784af1676a207412180ea223e7b917 WatchSource:0}: Error finding container c387672b68c23b026b8a386cf05ba0b503784af1676a207412180ea223e7b917: Status 404 returned error can't find the container with id c387672b68c23b026b8a386cf05ba0b503784af1676a207412180ea223e7b917 Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.019385 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033270 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc\") pod \"4980f8ae-d3e9-4a80-8257-c40696e11036\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033313 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config\") pod \"4980f8ae-d3e9-4a80-8257-c40696e11036\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033390 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcvdn\" (UniqueName: \"kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn\") pod \"4980f8ae-d3e9-4a80-8257-c40696e11036\" (UID: \"4980f8ae-d3e9-4a80-8257-c40696e11036\") " Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033649 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033690 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033746 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rdqs\" (UniqueName: \"kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033773 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033796 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033823 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.033884 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.034233 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4980f8ae-d3e9-4a80-8257-c40696e11036" (UID: "4980f8ae-d3e9-4a80-8257-c40696e11036"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.034399 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config" (OuterVolumeSpecName: "config") pod "4980f8ae-d3e9-4a80-8257-c40696e11036" (UID: "4980f8ae-d3e9-4a80-8257-c40696e11036"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.037769 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn" (OuterVolumeSpecName: "kube-api-access-jcvdn") pod "4980f8ae-d3e9-4a80-8257-c40696e11036" (UID: "4980f8ae-d3e9-4a80-8257-c40696e11036"). InnerVolumeSpecName "kube-api-access-jcvdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135000 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135068 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135117 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rdqs\" (UniqueName: \"kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135142 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc 
kubenswrapper[4893]: I0314 07:19:46.135166 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135194 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135261 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135332 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135345 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4980f8ae-d3e9-4a80-8257-c40696e11036-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.135357 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcvdn\" (UniqueName: \"kubernetes.io/projected/4980f8ae-d3e9-4a80-8257-c40696e11036-kube-api-access-jcvdn\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.136463 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.137200 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.137440 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.141153 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.144092 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.157833 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rdqs\" (UniqueName: \"kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.160725 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.162275 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.289494 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.415491 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:19:46 crc kubenswrapper[4893]: W0314 07:19:46.431971 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25fed27d_191c_4fc5_8b22_493c62637e65.slice/crio-d98ba75b2e83cde714ab20c495002a0e80255b027b65c04562918527633bf906 WatchSource:0}: Error finding container d98ba75b2e83cde714ab20c495002a0e80255b027b65c04562918527633bf906: Status 404 returned error can't find the container with id d98ba75b2e83cde714ab20c495002a0e80255b027b65c04562918527633bf906 Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.728203 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.743628 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" event={"ID":"25fed27d-191c-4fc5-8b22-493c62637e65","Type":"ContainerStarted","Data":"d98ba75b2e83cde714ab20c495002a0e80255b027b65c04562918527633bf906"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.745101 4893 generic.go:334] "Generic (PLEG): container finished" podID="f35a681c-c2d4-4513-8007-a3edefae561d" 
containerID="94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284" exitCode=0 Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.745132 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" event={"ID":"f35a681c-c2d4-4513-8007-a3edefae561d","Type":"ContainerDied","Data":"94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.745164 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" event={"ID":"f35a681c-c2d4-4513-8007-a3edefae561d","Type":"ContainerStarted","Data":"c387672b68c23b026b8a386cf05ba0b503784af1676a207412180ea223e7b917"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.746463 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" event={"ID":"4980f8ae-d3e9-4a80-8257-c40696e11036","Type":"ContainerDied","Data":"0fa90bfb5f1581490e308d2ba53933cca665b5835491e4263ce6b46526e0aec0"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.746546 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-4n27f" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.748021 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6de6c560-1e2c-4dca-b4c2-be4e51a5300f","Type":"ContainerStarted","Data":"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.748216 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.750149 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7bbf4" event={"ID":"457f660f-9b87-4d37-a92e-0c30bb2a2fea","Type":"ContainerStarted","Data":"eda28ec59118715e8794d9877944b4ef0d5e23bf08d3c266e2f2246bfd2ec127"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.750175 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7bbf4" event={"ID":"457f660f-9b87-4d37-a92e-0c30bb2a2fea","Type":"ContainerStarted","Data":"a755deaf332e8bde1c5b1a0c6f5a8dcc8c488df2d0fcb490d3d75e182b1de7d4"} Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.800566 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7bbf4" podStartSLOduration=1.800544594 podStartE2EDuration="1.800544594s" podCreationTimestamp="2026-03-14 07:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:19:46.795092931 +0000 UTC m=+1266.057269723" watchObservedRunningTime="2026-03-14 07:19:46.800544594 +0000 UTC m=+1266.062721396" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.823057 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.769739087 podStartE2EDuration="37.822875168s" podCreationTimestamp="2026-03-14 
07:19:09 +0000 UTC" firstStartedPulling="2026-03-14 07:19:10.104040633 +0000 UTC m=+1229.366217425" lastFinishedPulling="2026-03-14 07:19:46.157176714 +0000 UTC m=+1265.419353506" observedRunningTime="2026-03-14 07:19:46.821227819 +0000 UTC m=+1266.083404621" watchObservedRunningTime="2026-03-14 07:19:46.822875168 +0000 UTC m=+1266.085051960" Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.894215 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"] Mar 14 07:19:46 crc kubenswrapper[4893]: I0314 07:19:46.899631 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-4n27f"] Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.393890 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4980f8ae-d3e9-4a80-8257-c40696e11036" path="/var/lib/kubelet/pods/4980f8ae-d3e9-4a80-8257-c40696e11036/volumes" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.394780 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664bdee5-ebab-425c-b11e-2f8eed685b18" path="/var/lib/kubelet/pods/664bdee5-ebab-425c-b11e-2f8eed685b18/volumes" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.755100 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.756562 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.771046 4893 generic.go:334] "Generic (PLEG): container finished" podID="25fed27d-191c-4fc5-8b22-493c62637e65" containerID="f6adeb8abc5960a89a79eab5107c713c8c71bc003558950c9cd2fffd15c5edf0" exitCode=0 Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.771373 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" 
event={"ID":"25fed27d-191c-4fc5-8b22-493c62637e65","Type":"ContainerDied","Data":"f6adeb8abc5960a89a79eab5107c713c8c71bc003558950c9cd2fffd15c5edf0"} Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.775319 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" event={"ID":"f35a681c-c2d4-4513-8007-a3edefae561d","Type":"ContainerStarted","Data":"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991"} Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.775696 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.781437 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerStarted","Data":"7b7e7564bf784fd604eab18c4e27a75f662ed319eef40cfe755745c6359a0f95"} Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.852866 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 14 07:19:47 crc kubenswrapper[4893]: I0314 07:19:47.872871 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" podStartSLOduration=2.378105431 podStartE2EDuration="2.872855248s" podCreationTimestamp="2026-03-14 07:19:45 +0000 UTC" firstStartedPulling="2026-03-14 07:19:46.00559451 +0000 UTC m=+1265.267771302" lastFinishedPulling="2026-03-14 07:19:46.500344327 +0000 UTC m=+1265.762521119" observedRunningTime="2026-03-14 07:19:47.830053544 +0000 UTC m=+1267.092230356" watchObservedRunningTime="2026-03-14 07:19:47.872855248 +0000 UTC m=+1267.135032040" Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.791060 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerStarted","Data":"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1"} Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.791487 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerStarted","Data":"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec"} Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.791508 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.794715 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" event={"ID":"25fed27d-191c-4fc5-8b22-493c62637e65","Type":"ContainerStarted","Data":"0527ab0580ec7d69b21c7cb444daa7adb9e83a922cbe6da8a4a89673354b9b90"} Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.828740 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.8680227179999997 podStartE2EDuration="3.828702702s" podCreationTimestamp="2026-03-14 07:19:45 +0000 UTC" firstStartedPulling="2026-03-14 07:19:46.753966838 +0000 UTC m=+1266.016143630" lastFinishedPulling="2026-03-14 07:19:47.714646822 +0000 UTC m=+1266.976823614" observedRunningTime="2026-03-14 07:19:48.821086327 +0000 UTC m=+1268.083263189" watchObservedRunningTime="2026-03-14 07:19:48.828702702 +0000 UTC m=+1268.090879564" Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.853478 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" podStartSLOduration=3.416803144 podStartE2EDuration="3.853457206s" podCreationTimestamp="2026-03-14 07:19:45 +0000 UTC" firstStartedPulling="2026-03-14 07:19:46.435080877 +0000 UTC m=+1265.697257669" lastFinishedPulling="2026-03-14 07:19:46.871734939 +0000 UTC 
m=+1266.133911731" observedRunningTime="2026-03-14 07:19:48.841132795 +0000 UTC m=+1268.103309627" watchObservedRunningTime="2026-03-14 07:19:48.853457206 +0000 UTC m=+1268.115634008" Mar 14 07:19:48 crc kubenswrapper[4893]: I0314 07:19:48.914974 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 14 07:19:49 crc kubenswrapper[4893]: I0314 07:19:49.363121 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:49 crc kubenswrapper[4893]: I0314 07:19:49.363192 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:49 crc kubenswrapper[4893]: I0314 07:19:49.482763 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:49 crc kubenswrapper[4893]: I0314 07:19:49.810383 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:49 crc kubenswrapper[4893]: I0314 07:19:49.926974 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.619723 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1fc4-account-create-update-dlgm6"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.620908 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.623049 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.629304 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1fc4-account-create-update-dlgm6"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.691587 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wjrnn"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.692783 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.699698 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wjrnn"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.720862 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg7qg\" (UniqueName: \"kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.720980 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.815921 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w6klp"] Mar 14 07:19:50 crc 
kubenswrapper[4893]: I0314 07:19:50.817076 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w6klp" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.822981 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.823141 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7qg\" (UniqueName: \"kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.823205 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklwn\" (UniqueName: \"kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.823251 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.824190 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.833786 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w6klp"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.845126 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3b5a-account-create-update-8nwqm"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.846452 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.865823 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.866830 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7qg\" (UniqueName: \"kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg\") pod \"keystone-1fc4-account-create-update-dlgm6\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.875056 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b5a-account-create-update-8nwqm"] Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.924491 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts\") pod \"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:50 crc 
kubenswrapper[4893]: I0314 07:19:50.924602 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txhk\" (UniqueName: \"kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk\") pod \"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.924712 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dklwn\" (UniqueName: \"kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.924753 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgkwd\" (UniqueName: \"kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " pod="openstack/placement-db-create-w6klp" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.924785 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.924833 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " 
pod="openstack/placement-db-create-w6klp" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.926365 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.946256 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklwn\" (UniqueName: \"kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn\") pod \"keystone-db-create-wjrnn\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:50 crc kubenswrapper[4893]: I0314 07:19:50.970460 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.008384 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.026132 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4txhk\" (UniqueName: \"kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk\") pod \"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.026255 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgkwd\" (UniqueName: \"kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " pod="openstack/placement-db-create-w6klp" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.026311 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " pod="openstack/placement-db-create-w6klp" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.026364 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts\") pod \"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.027291 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts\") pod 
\"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.028265 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " pod="openstack/placement-db-create-w6klp" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.053221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txhk\" (UniqueName: \"kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk\") pod \"placement-3b5a-account-create-update-8nwqm\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.055168 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgkwd\" (UniqueName: \"kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd\") pod \"placement-db-create-w6klp\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " pod="openstack/placement-db-create-w6klp" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.136475 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w6klp" Mar 14 07:19:51 crc kubenswrapper[4893]: I0314 07:19:51.199964 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.410189 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1fc4-account-create-update-dlgm6"] Mar 14 07:19:52 crc kubenswrapper[4893]: W0314 07:19:51.420938 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda15dc402_9291_4c21_aec2_11e96c353687.slice/crio-943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b WatchSource:0}: Error finding container 943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b: Status 404 returned error can't find the container with id 943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.497606 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wjrnn"] Mar 14 07:19:52 crc kubenswrapper[4893]: W0314 07:19:51.505611 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55d860af_b6b6_4796_855c_ade1a4f88f33.slice/crio-4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46 WatchSource:0}: Error finding container 4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46: Status 404 returned error can't find the container with id 4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46 Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.611168 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.823727 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1fc4-account-create-update-dlgm6" 
event={"ID":"a15dc402-9291-4c21-aec2-11e96c353687","Type":"ContainerStarted","Data":"82ec983d96ddb66f0c0f9d7b584bdc0e99009b85136248cfa0e09a5bbe3c7cc6"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.824171 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1fc4-account-create-update-dlgm6" event={"ID":"a15dc402-9291-4c21-aec2-11e96c353687","Type":"ContainerStarted","Data":"943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.828104 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjrnn" event={"ID":"55d860af-b6b6-4796-855c-ade1a4f88f33","Type":"ContainerStarted","Data":"5ba2a8c54d4952bb7c42fc8298c22338c24c173a22e289545b0cedc062b91036"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.828333 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjrnn" event={"ID":"55d860af-b6b6-4796-855c-ade1a4f88f33","Type":"ContainerStarted","Data":"4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.849434 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1fc4-account-create-update-dlgm6" podStartSLOduration=1.849411172 podStartE2EDuration="1.849411172s" podCreationTimestamp="2026-03-14 07:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:19:51.837197254 +0000 UTC m=+1271.099374046" watchObservedRunningTime="2026-03-14 07:19:51.849411172 +0000 UTC m=+1271.111587954" Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:51.857404 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-wjrnn" podStartSLOduration=1.857385075 podStartE2EDuration="1.857385075s" podCreationTimestamp="2026-03-14 07:19:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:19:51.850355704 +0000 UTC m=+1271.112532496" watchObservedRunningTime="2026-03-14 07:19:51.857385075 +0000 UTC m=+1271.119561867" Mar 14 07:19:52 crc kubenswrapper[4893]: W0314 07:19:52.405638 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48cda34b_1f0b_463f_92a5_ba03353eac80.slice/crio-ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e WatchSource:0}: Error finding container ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e: Status 404 returned error can't find the container with id ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.406253 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w6klp"] Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.499556 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b5a-account-create-update-8nwqm"] Mar 14 07:19:52 crc kubenswrapper[4893]: W0314 07:19:52.503956 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b3f7b9_3c33_4486_bba1_2f528c0eb212.slice/crio-339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee WatchSource:0}: Error finding container 339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee: Status 404 returned error can't find the container with id 339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.837158 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-8nwqm" 
event={"ID":"40b3f7b9-3c33-4486-bba1-2f528c0eb212","Type":"ContainerStarted","Data":"415fb706f50c03599c55095f85501084d2b91b5d78ec775f3af93b69ce990538"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.837206 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-8nwqm" event={"ID":"40b3f7b9-3c33-4486-bba1-2f528c0eb212","Type":"ContainerStarted","Data":"339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.843543 4893 generic.go:334] "Generic (PLEG): container finished" podID="a15dc402-9291-4c21-aec2-11e96c353687" containerID="82ec983d96ddb66f0c0f9d7b584bdc0e99009b85136248cfa0e09a5bbe3c7cc6" exitCode=0 Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.843649 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1fc4-account-create-update-dlgm6" event={"ID":"a15dc402-9291-4c21-aec2-11e96c353687","Type":"ContainerDied","Data":"82ec983d96ddb66f0c0f9d7b584bdc0e99009b85136248cfa0e09a5bbe3c7cc6"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.851356 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w6klp" event={"ID":"48cda34b-1f0b-463f-92a5-ba03353eac80","Type":"ContainerStarted","Data":"6a03dce0d2766abd66647169b859b2593ab9d52e6798ec4c10aa43a61bbaab8a"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.851418 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w6klp" event={"ID":"48cda34b-1f0b-463f-92a5-ba03353eac80","Type":"ContainerStarted","Data":"ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e"} Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.853892 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3b5a-account-create-update-8nwqm" podStartSLOduration=2.853872621 podStartE2EDuration="2.853872621s" podCreationTimestamp="2026-03-14 07:19:50 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:19:52.851858393 +0000 UTC m=+1272.114035195" watchObservedRunningTime="2026-03-14 07:19:52.853872621 +0000 UTC m=+1272.116049413" Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.858458 4893 generic.go:334] "Generic (PLEG): container finished" podID="55d860af-b6b6-4796-855c-ade1a4f88f33" containerID="5ba2a8c54d4952bb7c42fc8298c22338c24c173a22e289545b0cedc062b91036" exitCode=0 Mar 14 07:19:52 crc kubenswrapper[4893]: I0314 07:19:52.858519 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjrnn" event={"ID":"55d860af-b6b6-4796-855c-ade1a4f88f33","Type":"ContainerDied","Data":"5ba2a8c54d4952bb7c42fc8298c22338c24c173a22e289545b0cedc062b91036"} Mar 14 07:19:53 crc kubenswrapper[4893]: I0314 07:19:53.868418 4893 generic.go:334] "Generic (PLEG): container finished" podID="48cda34b-1f0b-463f-92a5-ba03353eac80" containerID="6a03dce0d2766abd66647169b859b2593ab9d52e6798ec4c10aa43a61bbaab8a" exitCode=0 Mar 14 07:19:53 crc kubenswrapper[4893]: I0314 07:19:53.868507 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w6klp" event={"ID":"48cda34b-1f0b-463f-92a5-ba03353eac80","Type":"ContainerDied","Data":"6a03dce0d2766abd66647169b859b2593ab9d52e6798ec4c10aa43a61bbaab8a"} Mar 14 07:19:53 crc kubenswrapper[4893]: I0314 07:19:53.872309 4893 generic.go:334] "Generic (PLEG): container finished" podID="40b3f7b9-3c33-4486-bba1-2f528c0eb212" containerID="415fb706f50c03599c55095f85501084d2b91b5d78ec775f3af93b69ce990538" exitCode=0 Mar 14 07:19:53 crc kubenswrapper[4893]: I0314 07:19:53.872363 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-8nwqm" event={"ID":"40b3f7b9-3c33-4486-bba1-2f528c0eb212","Type":"ContainerDied","Data":"415fb706f50c03599c55095f85501084d2b91b5d78ec775f3af93b69ce990538"} Mar 14 
07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.377983 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.383414 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.387165 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w6klp" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481776 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dklwn\" (UniqueName: \"kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn\") pod \"55d860af-b6b6-4796-855c-ade1a4f88f33\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481835 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgkwd\" (UniqueName: \"kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd\") pod \"48cda34b-1f0b-463f-92a5-ba03353eac80\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481857 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts\") pod \"55d860af-b6b6-4796-855c-ade1a4f88f33\" (UID: \"55d860af-b6b6-4796-855c-ade1a4f88f33\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481896 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts\") pod \"a15dc402-9291-4c21-aec2-11e96c353687\" (UID: 
\"a15dc402-9291-4c21-aec2-11e96c353687\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481965 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg7qg\" (UniqueName: \"kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg\") pod \"a15dc402-9291-4c21-aec2-11e96c353687\" (UID: \"a15dc402-9291-4c21-aec2-11e96c353687\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.481989 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts\") pod \"48cda34b-1f0b-463f-92a5-ba03353eac80\" (UID: \"48cda34b-1f0b-463f-92a5-ba03353eac80\") " Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.483540 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a15dc402-9291-4c21-aec2-11e96c353687" (UID: "a15dc402-9291-4c21-aec2-11e96c353687"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.483912 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48cda34b-1f0b-463f-92a5-ba03353eac80" (UID: "48cda34b-1f0b-463f-92a5-ba03353eac80"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.483920 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55d860af-b6b6-4796-855c-ade1a4f88f33" (UID: "55d860af-b6b6-4796-855c-ade1a4f88f33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.487387 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn" (OuterVolumeSpecName: "kube-api-access-dklwn") pod "55d860af-b6b6-4796-855c-ade1a4f88f33" (UID: "55d860af-b6b6-4796-855c-ade1a4f88f33"). InnerVolumeSpecName "kube-api-access-dklwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.487465 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd" (OuterVolumeSpecName: "kube-api-access-lgkwd") pod "48cda34b-1f0b-463f-92a5-ba03353eac80" (UID: "48cda34b-1f0b-463f-92a5-ba03353eac80"). InnerVolumeSpecName "kube-api-access-lgkwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.494757 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg" (OuterVolumeSpecName: "kube-api-access-jg7qg") pod "a15dc402-9291-4c21-aec2-11e96c353687" (UID: "a15dc402-9291-4c21-aec2-11e96c353687"). InnerVolumeSpecName "kube-api-access-jg7qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.552333 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584444 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dklwn\" (UniqueName: \"kubernetes.io/projected/55d860af-b6b6-4796-855c-ade1a4f88f33-kube-api-access-dklwn\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584676 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgkwd\" (UniqueName: \"kubernetes.io/projected/48cda34b-1f0b-463f-92a5-ba03353eac80-kube-api-access-lgkwd\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584768 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d860af-b6b6-4796-855c-ade1a4f88f33-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584826 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15dc402-9291-4c21-aec2-11e96c353687-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584893 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg7qg\" (UniqueName: \"kubernetes.io/projected/a15dc402-9291-4c21-aec2-11e96c353687-kube-api-access-jg7qg\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.584960 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48cda34b-1f0b-463f-92a5-ba03353eac80-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.881431 4893 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-1fc4-account-create-update-dlgm6" event={"ID":"a15dc402-9291-4c21-aec2-11e96c353687","Type":"ContainerDied","Data":"943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b"} Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.881966 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943e2158dfc92cef6d72b0e10c27800cd5e2d5fbfc620cf19ea960e2fe0e760b" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.881673 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-dlgm6" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.883658 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w6klp" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.883678 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w6klp" event={"ID":"48cda34b-1f0b-463f-92a5-ba03353eac80","Type":"ContainerDied","Data":"ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e"} Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.883850 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebadd62023e4bbc383356a2889753abdc8c10335e2923970c9485b157635151e" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.885928 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjrnn" event={"ID":"55d860af-b6b6-4796-855c-ade1a4f88f33","Type":"ContainerDied","Data":"4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46"} Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.885958 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4983b6a33907af0b0afa640791a2c05ea88daf4ba31267b482aad259a61f2d46" Mar 14 07:19:54 crc kubenswrapper[4893]: I0314 07:19:54.885994 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wjrnn" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.507800 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.742310 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-k8kjv"] Mar 14 07:19:55 crc kubenswrapper[4893]: E0314 07:19:55.743067 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cda34b-1f0b-463f-92a5-ba03353eac80" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743094 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cda34b-1f0b-463f-92a5-ba03353eac80" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: E0314 07:19:55.743118 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15dc402-9291-4c21-aec2-11e96c353687" containerName="mariadb-account-create-update" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743126 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15dc402-9291-4c21-aec2-11e96c353687" containerName="mariadb-account-create-update" Mar 14 07:19:55 crc kubenswrapper[4893]: E0314 07:19:55.743139 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d860af-b6b6-4796-855c-ade1a4f88f33" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743147 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d860af-b6b6-4796-855c-ade1a4f88f33" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743312 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d860af-b6b6-4796-855c-ade1a4f88f33" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743329 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48cda34b-1f0b-463f-92a5-ba03353eac80" containerName="mariadb-database-create" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.743350 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15dc402-9291-4c21-aec2-11e96c353687" containerName="mariadb-account-create-update" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.744231 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.756920 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-k8kjv"] Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.852448 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9600-account-create-update-dwgnl"] Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.854124 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.862902 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.865010 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9600-account-create-update-dwgnl"] Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.888829 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.895816 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.895893 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-8nwqm" event={"ID":"40b3f7b9-3c33-4486-bba1-2f528c0eb212","Type":"ContainerDied","Data":"339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee"} Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.895928 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="339bc90ed77330e121be0811087cb9d0560d7cae0b4b1bbb2d9b64e76f83cfee" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.896082 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b5a-account-create-update-8nwqm" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.916077 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.916159 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sjg6\" (UniqueName: \"kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.971324 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 
07:19:55 crc kubenswrapper[4893]: I0314 07:19:55.971688 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="dnsmasq-dns" containerID="cri-o://9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991" gracePeriod=10 Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.017499 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txhk\" (UniqueName: \"kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk\") pod \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.017622 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts\") pod \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\" (UID: \"40b3f7b9-3c33-4486-bba1-2f528c0eb212\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.017813 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sjg6\" (UniqueName: \"kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.017937 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.017969 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.018021 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.018502 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40b3f7b9-3c33-4486-bba1-2f528c0eb212" (UID: "40b3f7b9-3c33-4486-bba1-2f528c0eb212"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.019238 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.023886 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk" (OuterVolumeSpecName: "kube-api-access-4txhk") pod "40b3f7b9-3c33-4486-bba1-2f528c0eb212" (UID: "40b3f7b9-3c33-4486-bba1-2f528c0eb212"). InnerVolumeSpecName "kube-api-access-4txhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.039142 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sjg6\" (UniqueName: \"kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6\") pod \"glance-db-create-k8kjv\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.071061 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.119850 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.119961 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.120039 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40b3f7b9-3c33-4486-bba1-2f528c0eb212-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.120051 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4txhk\" (UniqueName: \"kubernetes.io/projected/40b3f7b9-3c33-4486-bba1-2f528c0eb212-kube-api-access-4txhk\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.120550 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.139861 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm\") pod \"glance-9600-account-create-update-dwgnl\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.202738 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.409721 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418137 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-55gt6"] Mar 14 07:19:56 crc kubenswrapper[4893]: E0314 07:19:56.418570 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b3f7b9-3c33-4486-bba1-2f528c0eb212" containerName="mariadb-account-create-update" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418638 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b3f7b9-3c33-4486-bba1-2f528c0eb212" containerName="mariadb-account-create-update" Mar 14 07:19:56 crc kubenswrapper[4893]: E0314 07:19:56.418694 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="dnsmasq-dns" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418707 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="dnsmasq-dns" Mar 14 07:19:56 crc kubenswrapper[4893]: E0314 07:19:56.418725 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="init" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418734 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="init" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418920 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" containerName="dnsmasq-dns" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.418951 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b3f7b9-3c33-4486-bba1-2f528c0eb212" containerName="mariadb-account-create-update" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.419677 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.422085 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.425799 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55gt6"] Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.526421 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config\") pod \"f35a681c-c2d4-4513-8007-a3edefae561d\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.526600 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvkbv\" (UniqueName: \"kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv\") pod \"f35a681c-c2d4-4513-8007-a3edefae561d\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.526675 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb\") pod \"f35a681c-c2d4-4513-8007-a3edefae561d\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.526699 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc\") pod \"f35a681c-c2d4-4513-8007-a3edefae561d\" (UID: \"f35a681c-c2d4-4513-8007-a3edefae561d\") " Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.526912 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.527020 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvsl\" (UniqueName: \"kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.533122 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv" (OuterVolumeSpecName: "kube-api-access-bvkbv") pod "f35a681c-c2d4-4513-8007-a3edefae561d" (UID: "f35a681c-c2d4-4513-8007-a3edefae561d"). InnerVolumeSpecName "kube-api-access-bvkbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.567426 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f35a681c-c2d4-4513-8007-a3edefae561d" (UID: "f35a681c-c2d4-4513-8007-a3edefae561d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.568067 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config" (OuterVolumeSpecName: "config") pod "f35a681c-c2d4-4513-8007-a3edefae561d" (UID: "f35a681c-c2d4-4513-8007-a3edefae561d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.568664 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f35a681c-c2d4-4513-8007-a3edefae561d" (UID: "f35a681c-c2d4-4513-8007-a3edefae561d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.601418 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-k8kjv"] Mar 14 07:19:56 crc kubenswrapper[4893]: W0314 07:19:56.605267 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b4ee6b9_d425_48ad_a01e_9fbb7354e798.slice/crio-904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb WatchSource:0}: Error finding container 904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb: Status 404 returned error can't find the container with id 904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629021 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629158 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvsl\" (UniqueName: \"kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " 
pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629258 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvkbv\" (UniqueName: \"kubernetes.io/projected/f35a681c-c2d4-4513-8007-a3edefae561d-kube-api-access-bvkbv\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629280 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629290 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629301 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35a681c-c2d4-4513-8007-a3edefae561d-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.629865 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.652963 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvsl\" (UniqueName: \"kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl\") pod \"root-account-create-update-55gt6\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.732455 4893 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-9600-account-create-update-dwgnl"] Mar 14 07:19:56 crc kubenswrapper[4893]: W0314 07:19:56.735090 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12390666_c0e4_4e0f_90fe_1cf230bc4702.slice/crio-d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80 WatchSource:0}: Error finding container d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80: Status 404 returned error can't find the container with id d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80 Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.738426 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.909239 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k8kjv" event={"ID":"0b4ee6b9-d425-48ad-a01e-9fbb7354e798","Type":"ContainerStarted","Data":"904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb"} Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.912670 4893 generic.go:334] "Generic (PLEG): container finished" podID="f35a681c-c2d4-4513-8007-a3edefae561d" containerID="9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991" exitCode=0 Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.912736 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" event={"ID":"f35a681c-c2d4-4513-8007-a3edefae561d","Type":"ContainerDied","Data":"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991"} Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.912759 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" event={"ID":"f35a681c-c2d4-4513-8007-a3edefae561d","Type":"ContainerDied","Data":"c387672b68c23b026b8a386cf05ba0b503784af1676a207412180ea223e7b917"} 
Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.912776 4893 scope.go:117] "RemoveContainer" containerID="9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.912799 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-h6b2h" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.917275 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9600-account-create-update-dwgnl" event={"ID":"12390666-c0e4-4e0f-90fe-1cf230bc4702","Type":"ContainerStarted","Data":"d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80"} Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.941824 4893 scope.go:117] "RemoveContainer" containerID="94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.957453 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.957498 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-h6b2h"] Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.963580 4893 scope.go:117] "RemoveContainer" containerID="9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991" Mar 14 07:19:56 crc kubenswrapper[4893]: E0314 07:19:56.964026 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991\": container with ID starting with 9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991 not found: ID does not exist" containerID="9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.964068 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991"} err="failed to get container status \"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991\": rpc error: code = NotFound desc = could not find container \"9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991\": container with ID starting with 9d3fb5752d805f8699050a9d375aa026ddf5a17782356e7c467cd5c39d287991 not found: ID does not exist" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.964088 4893 scope.go:117] "RemoveContainer" containerID="94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284" Mar 14 07:19:56 crc kubenswrapper[4893]: E0314 07:19:56.964423 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284\": container with ID starting with 94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284 not found: ID does not exist" containerID="94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284" Mar 14 07:19:56 crc kubenswrapper[4893]: I0314 07:19:56.964459 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284"} err="failed to get container status \"94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284\": rpc error: code = NotFound desc = could not find container \"94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284\": container with ID starting with 94a2a13f01d8b032ec54c5e5c2f9e6dd2fe470789abc9cb70c7d252e2b6ed284 not found: ID does not exist" Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.180378 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55gt6"] Mar 14 07:19:57 crc kubenswrapper[4893]: W0314 07:19:57.188216 4893 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod100561fe_7c72_45e3_846b_67ab83db1136.slice/crio-5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973 WatchSource:0}: Error finding container 5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973: Status 404 returned error can't find the container with id 5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973 Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.386639 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f35a681c-c2d4-4513-8007-a3edefae561d" path="/var/lib/kubelet/pods/f35a681c-c2d4-4513-8007-a3edefae561d/volumes" Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.929589 4893 generic.go:334] "Generic (PLEG): container finished" podID="100561fe-7c72-45e3-846b-67ab83db1136" containerID="53154c3e57be85d18226e2f1de5df6ce18d25ffce08f2f5e8c8de1fcba876ebf" exitCode=0 Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.929628 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55gt6" event={"ID":"100561fe-7c72-45e3-846b-67ab83db1136","Type":"ContainerDied","Data":"53154c3e57be85d18226e2f1de5df6ce18d25ffce08f2f5e8c8de1fcba876ebf"} Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.929660 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55gt6" event={"ID":"100561fe-7c72-45e3-846b-67ab83db1136","Type":"ContainerStarted","Data":"5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973"} Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.931420 4893 generic.go:334] "Generic (PLEG): container finished" podID="12390666-c0e4-4e0f-90fe-1cf230bc4702" containerID="b504f6819a498b47ce1fdbb1b7b652aa93b55e0a07d73848aeab56754f086a86" exitCode=0 Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.931512 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9600-account-create-update-dwgnl" 
event={"ID":"12390666-c0e4-4e0f-90fe-1cf230bc4702","Type":"ContainerDied","Data":"b504f6819a498b47ce1fdbb1b7b652aa93b55e0a07d73848aeab56754f086a86"} Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.933953 4893 generic.go:334] "Generic (PLEG): container finished" podID="0b4ee6b9-d425-48ad-a01e-9fbb7354e798" containerID="2c709f059a4f699836e9f1897e343dfc2e8a083c4e8cbaf73b7b97a62fcff1f6" exitCode=0 Mar 14 07:19:57 crc kubenswrapper[4893]: I0314 07:19:57.934038 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k8kjv" event={"ID":"0b4ee6b9-d425-48ad-a01e-9fbb7354e798","Type":"ContainerDied","Data":"2c709f059a4f699836e9f1897e343dfc2e8a083c4e8cbaf73b7b97a62fcff1f6"} Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.425726 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.431405 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.438149 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.575772 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sjg6\" (UniqueName: \"kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6\") pod \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.575907 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts\") pod \"12390666-c0e4-4e0f-90fe-1cf230bc4702\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.575934 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts\") pod \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\" (UID: \"0b4ee6b9-d425-48ad-a01e-9fbb7354e798\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.575954 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm\") pod \"12390666-c0e4-4e0f-90fe-1cf230bc4702\" (UID: \"12390666-c0e4-4e0f-90fe-1cf230bc4702\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.575994 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts\") pod \"100561fe-7c72-45e3-846b-67ab83db1136\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.576052 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvsl\" (UniqueName: \"kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl\") pod \"100561fe-7c72-45e3-846b-67ab83db1136\" (UID: \"100561fe-7c72-45e3-846b-67ab83db1136\") " Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.576769 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "100561fe-7c72-45e3-846b-67ab83db1136" (UID: "100561fe-7c72-45e3-846b-67ab83db1136"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.576778 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12390666-c0e4-4e0f-90fe-1cf230bc4702" (UID: "12390666-c0e4-4e0f-90fe-1cf230bc4702"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.577277 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b4ee6b9-d425-48ad-a01e-9fbb7354e798" (UID: "0b4ee6b9-d425-48ad-a01e-9fbb7354e798"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.581443 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl" (OuterVolumeSpecName: "kube-api-access-rzvsl") pod "100561fe-7c72-45e3-846b-67ab83db1136" (UID: "100561fe-7c72-45e3-846b-67ab83db1136"). 
InnerVolumeSpecName "kube-api-access-rzvsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.581711 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6" (OuterVolumeSpecName: "kube-api-access-7sjg6") pod "0b4ee6b9-d425-48ad-a01e-9fbb7354e798" (UID: "0b4ee6b9-d425-48ad-a01e-9fbb7354e798"). InnerVolumeSpecName "kube-api-access-7sjg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.581748 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm" (OuterVolumeSpecName: "kube-api-access-6rwvm") pod "12390666-c0e4-4e0f-90fe-1cf230bc4702" (UID: "12390666-c0e4-4e0f-90fe-1cf230bc4702"). InnerVolumeSpecName "kube-api-access-6rwvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678173 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100561fe-7c72-45e3-846b-67ab83db1136-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678221 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvsl\" (UniqueName: \"kubernetes.io/projected/100561fe-7c72-45e3-846b-67ab83db1136-kube-api-access-rzvsl\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678243 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sjg6\" (UniqueName: \"kubernetes.io/projected/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-kube-api-access-7sjg6\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678262 4893 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12390666-c0e4-4e0f-90fe-1cf230bc4702-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678279 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b4ee6b9-d425-48ad-a01e-9fbb7354e798-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.678296 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/12390666-c0e4-4e0f-90fe-1cf230bc4702-kube-api-access-6rwvm\") on node \"crc\" DevicePath \"\"" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.963864 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k8kjv" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.963852 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k8kjv" event={"ID":"0b4ee6b9-d425-48ad-a01e-9fbb7354e798","Type":"ContainerDied","Data":"904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb"} Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.964348 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904bb90f36f3b18f0de483eab8958f19a764ec14974259b92600a466204344cb" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.967049 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55gt6" event={"ID":"100561fe-7c72-45e3-846b-67ab83db1136","Type":"ContainerDied","Data":"5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973"} Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.967184 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa68771f9acf7b6e8a90989e10e0124983a00e695f7adce03a5275d5e902973" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 
07:19:59.967094 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55gt6" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.969611 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9600-account-create-update-dwgnl" event={"ID":"12390666-c0e4-4e0f-90fe-1cf230bc4702","Type":"ContainerDied","Data":"d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80"} Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.969662 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e7d2ffebcc1f9072b997748f9412a0c3f9fa18bd8735ad7d28d6f120d6ba80" Mar 14 07:19:59 crc kubenswrapper[4893]: I0314 07:19:59.969702 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9600-account-create-update-dwgnl" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.161092 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557880-glk9z"] Mar 14 07:20:00 crc kubenswrapper[4893]: E0314 07:20:00.161659 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4ee6b9-d425-48ad-a01e-9fbb7354e798" containerName="mariadb-database-create" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.161687 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4ee6b9-d425-48ad-a01e-9fbb7354e798" containerName="mariadb-database-create" Mar 14 07:20:00 crc kubenswrapper[4893]: E0314 07:20:00.161713 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12390666-c0e4-4e0f-90fe-1cf230bc4702" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.161725 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="12390666-c0e4-4e0f-90fe-1cf230bc4702" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: E0314 07:20:00.161768 4893 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="100561fe-7c72-45e3-846b-67ab83db1136" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.161780 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="100561fe-7c72-45e3-846b-67ab83db1136" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.162058 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="100561fe-7c72-45e3-846b-67ab83db1136" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.162096 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4ee6b9-d425-48ad-a01e-9fbb7354e798" containerName="mariadb-database-create" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.162114 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="12390666-c0e4-4e0f-90fe-1cf230bc4702" containerName="mariadb-account-create-update" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.162929 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.165257 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.166722 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.167202 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.169864 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-glk9z"] Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.287500 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hgz\" (UniqueName: \"kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz\") pod \"auto-csr-approver-29557880-glk9z\" (UID: \"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9\") " pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.389095 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hgz\" (UniqueName: \"kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz\") pod \"auto-csr-approver-29557880-glk9z\" (UID: \"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9\") " pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.412924 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hgz\" (UniqueName: \"kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz\") pod \"auto-csr-approver-29557880-glk9z\" (UID: \"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9\") " 
pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.492666 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.932202 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-glk9z"] Mar 14 07:20:00 crc kubenswrapper[4893]: I0314 07:20:00.980582 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-glk9z" event={"ID":"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9","Type":"ContainerStarted","Data":"8e83d49ef57157df1f51b872496c96fa347a395fffc9be13a9b305c02acb4197"} Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.087371 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-98sg2"] Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.088677 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.091908 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.092195 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-74rpk" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.095148 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-98sg2"] Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.206458 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6445s\" (UniqueName: \"kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.206755 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.206905 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.206998 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.308745 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6445s\" (UniqueName: \"kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.309095 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.309152 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.309180 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.313930 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data\") pod \"glance-db-sync-98sg2\" (UID: 
\"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.315449 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.316652 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.327561 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6445s\" (UniqueName: \"kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s\") pod \"glance-db-sync-98sg2\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.406206 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.714223 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"] Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.728873 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.754598 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"] Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.925513 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.925759 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.925835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglfz\" (UniqueName: \"kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.925985 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.926035 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.977980 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-98sg2"] Mar 14 07:20:01 crc kubenswrapper[4893]: I0314 07:20:01.994571 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98sg2" event={"ID":"47f8afde-298d-42a9-a92e-dd3b4561b98c","Type":"ContainerStarted","Data":"dd59586a4186f8d0aaa06ad45d497f838321bcfb53ab8cd0efdb854d02ebc0e0"} Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.026923 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.026972 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.026996 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.027049 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.027077 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglfz\" (UniqueName: \"kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.028105 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.028124 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.028244 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.028653 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.045417 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglfz\" (UniqueName: \"kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz\") pod \"dnsmasq-dns-7b9fd7d84c-b8mk8\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.056536 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.496983 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"] Mar 14 07:20:02 crc kubenswrapper[4893]: W0314 07:20:02.502145 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82962bac_4b7c_45ec_a46b_7c2a01f4db61.slice/crio-bfea1c41ccd77f44e0a66427622ac65c48195904248df159f994f9e9f4cd33c1 WatchSource:0}: Error finding container bfea1c41ccd77f44e0a66427622ac65c48195904248df159f994f9e9f4cd33c1: Status 404 returned error can't find the container with id bfea1c41ccd77f44e0a66427622ac65c48195904248df159f994f9e9f4cd33c1 Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.902724 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.915758 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.919704 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.919893 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.920041 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bd9kg" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.921323 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 07:20:02 crc kubenswrapper[4893]: I0314 07:20:02.938162 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.003422 4893 generic.go:334] "Generic (PLEG): container finished" podID="a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" containerID="dfa315a626ac49bd79c0bdbc8fb09eebb1455a5dac7515215a91a87dd968d005" exitCode=0 Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.003495 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-glk9z" event={"ID":"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9","Type":"ContainerDied","Data":"dfa315a626ac49bd79c0bdbc8fb09eebb1455a5dac7515215a91a87dd968d005"} Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.005681 4893 generic.go:334] "Generic (PLEG): container finished" podID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerID="98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be" exitCode=0 Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.005735 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" 
event={"ID":"82962bac-4b7c-45ec-a46b-7c2a01f4db61","Type":"ContainerDied","Data":"98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be"} Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.005788 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" event={"ID":"82962bac-4b7c-45ec-a46b-7c2a01f4db61","Type":"ContainerStarted","Data":"bfea1c41ccd77f44e0a66427622ac65c48195904248df159f994f9e9f4cd33c1"} Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.046408 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cp7\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.046843 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.046958 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.047049 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: 
I0314 07:20:03.047126 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.047290 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.052023 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-55gt6"] Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.058008 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-55gt6"] Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.149741 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150123 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150164 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache\") pod \"swift-storage-0\" 
(UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150191 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150299 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.150321 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.150341 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150362 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cp7\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.150392 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:03.65037175 +0000 UTC m=+1282.912548552 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150461 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150562 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.150789 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.155969 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.167118 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cp7\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " 
pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.176674 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.414427 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100561fe-7c72-45e3-846b-67ab83db1136" path="/var/lib/kubelet/pods/100561fe-7c72-45e3-846b-67ab83db1136/volumes" Mar 14 07:20:03 crc kubenswrapper[4893]: I0314 07:20:03.658097 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.658271 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.658285 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:03 crc kubenswrapper[4893]: E0314 07:20:03.658326 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:04.6583131 +0000 UTC m=+1283.920489892 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.015305 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" event={"ID":"82962bac-4b7c-45ec-a46b-7c2a01f4db61","Type":"ContainerStarted","Data":"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"} Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.035235 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" podStartSLOduration=3.035218765 podStartE2EDuration="3.035218765s" podCreationTimestamp="2026-03-14 07:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:04.031962326 +0000 UTC m=+1283.294139108" watchObservedRunningTime="2026-03-14 07:20:04.035218765 +0000 UTC m=+1283.297395557" Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.361542 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.475941 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hgz\" (UniqueName: \"kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz\") pod \"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9\" (UID: \"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9\") " Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.482916 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz" (OuterVolumeSpecName: "kube-api-access-85hgz") pod "a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" (UID: "a2b569a2-4d37-4f06-b9d3-12d05d3a66a9"). InnerVolumeSpecName "kube-api-access-85hgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.579596 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hgz\" (UniqueName: \"kubernetes.io/projected/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9-kube-api-access-85hgz\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:04 crc kubenswrapper[4893]: I0314 07:20:04.681606 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:04 crc kubenswrapper[4893]: E0314 07:20:04.681771 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:04 crc kubenswrapper[4893]: E0314 07:20:04.681785 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:04 crc kubenswrapper[4893]: E0314 07:20:04.681837 
4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:06.681822774 +0000 UTC m=+1285.943999566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.024146 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-glk9z" event={"ID":"a2b569a2-4d37-4f06-b9d3-12d05d3a66a9","Type":"ContainerDied","Data":"8e83d49ef57157df1f51b872496c96fa347a395fffc9be13a9b305c02acb4197"} Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.024980 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e83d49ef57157df1f51b872496c96fa347a395fffc9be13a9b305c02acb4197" Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.025012 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.024163 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-glk9z" Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.410037 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-7lqhj"] Mar 14 07:20:05 crc kubenswrapper[4893]: I0314 07:20:05.419448 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-7lqhj"] Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.412900 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.738618 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:06 crc kubenswrapper[4893]: E0314 07:20:06.738774 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:06 crc kubenswrapper[4893]: E0314 07:20:06.739061 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:06 crc kubenswrapper[4893]: E0314 07:20:06.739128 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:10.739107482 +0000 UTC m=+1290.001284284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.742835 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jpm56"] Mar 14 07:20:06 crc kubenswrapper[4893]: E0314 07:20:06.743251 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" containerName="oc" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.743274 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" containerName="oc" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.743445 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" containerName="oc" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.744145 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.747054 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.747339 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.749083 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.772114 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jpm56"] Mar 14 07:20:06 crc kubenswrapper[4893]: E0314 07:20:06.772816 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-lkp2v ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-lkp2v ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-jpm56" podUID="d36f5ab5-4eef-47c7-af0b-7c72437db021" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.782598 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wr6c4"] Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.783761 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.800178 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wr6c4"] Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.822796 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jpm56"] Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840220 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840331 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkp2v\" (UniqueName: \"kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840359 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840381 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 
07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840398 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840457 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840486 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840533 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840633 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " 
pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840705 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840736 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdqz\" (UniqueName: \"kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840771 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840815 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.840844 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") 
" pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942380 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942447 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942506 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xdqz\" (UniqueName: \"kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942553 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942588 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 
07:20:06.942787 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942830 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942888 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkp2v\" (UniqueName: \"kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942916 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942962 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.942989 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.943023 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.943053 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.943081 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.943741 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.943926 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.944041 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.944164 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.944462 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.945422 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.948920 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf\") pod \"swift-ring-rebalance-jpm56\" (UID: 
\"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.949308 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.949449 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.949594 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.949866 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.960855 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdqz\" (UniqueName: \"kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc 
kubenswrapper[4893]: I0314 07:20:06.969105 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf\") pod \"swift-ring-rebalance-wr6c4\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:06 crc kubenswrapper[4893]: I0314 07:20:06.979031 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkp2v\" (UniqueName: \"kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v\") pod \"swift-ring-rebalance-jpm56\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.037983 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.049099 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.106326 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153309 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkp2v\" (UniqueName: \"kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153388 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153419 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153461 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153507 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153570 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.153661 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf\") pod \"d36f5ab5-4eef-47c7-af0b-7c72437db021\" (UID: \"d36f5ab5-4eef-47c7-af0b-7c72437db021\") " Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.158106 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.161230 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v" (OuterVolumeSpecName: "kube-api-access-lkp2v") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "kube-api-access-lkp2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.161855 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.162184 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts" (OuterVolumeSpecName: "scripts") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.162313 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.166024 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.168817 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d36f5ab5-4eef-47c7-af0b-7c72437db021" (UID: "d36f5ab5-4eef-47c7-af0b-7c72437db021"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.256648 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkp2v\" (UniqueName: \"kubernetes.io/projected/d36f5ab5-4eef-47c7-af0b-7c72437db021-kube-api-access-lkp2v\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257049 4893 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257066 4893 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257076 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36f5ab5-4eef-47c7-af0b-7c72437db021-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257085 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257093 4893 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d36f5ab5-4eef-47c7-af0b-7c72437db021-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.257103 4893 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d36f5ab5-4eef-47c7-af0b-7c72437db021-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.396343 4893 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="521aeee7-0d64-4708-8f7b-7718bfbaed47" path="/var/lib/kubelet/pods/521aeee7-0d64-4708-8f7b-7718bfbaed47/volumes" Mar 14 07:20:07 crc kubenswrapper[4893]: I0314 07:20:07.554097 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wr6c4"] Mar 14 07:20:07 crc kubenswrapper[4893]: W0314 07:20:07.562982 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77e9706_a6b7_4f26_9897_8f5d66642a67.slice/crio-6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187 WatchSource:0}: Error finding container 6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187: Status 404 returned error can't find the container with id 6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187 Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.046368 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr6c4" event={"ID":"e77e9706-a6b7-4f26-9897-8f5d66642a67","Type":"ContainerStarted","Data":"6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187"} Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.046660 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7crn6"] Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.047648 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.049534 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.053917 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7crn6"] Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.054167 4893 generic.go:334] "Generic (PLEG): container finished" podID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerID="682d8cd293d9dce0bae81b6b24fad5bbdb451e309a9432d29cbee73b0bde8366" exitCode=0 Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.054231 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerDied","Data":"682d8cd293d9dce0bae81b6b24fad5bbdb451e309a9432d29cbee73b0bde8366"} Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.055605 4893 generic.go:334] "Generic (PLEG): container finished" podID="a752b3c8-284e-490f-be39-506e7a075c6f" containerID="292f113e5f439a52265f81f58611b262cd2d04d5dfef8d95fd26c5f4c46fb3b1" exitCode=0 Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.055662 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jpm56" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.055664 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerDied","Data":"292f113e5f439a52265f81f58611b262cd2d04d5dfef8d95fd26c5f4c46fb3b1"} Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.069470 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjxb\" (UniqueName: \"kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.069557 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.171482 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjxb\" (UniqueName: \"kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.171578 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " 
pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.172304 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.174169 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jpm56"] Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.184053 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jpm56"] Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.196975 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjxb\" (UniqueName: \"kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb\") pod \"root-account-create-update-7crn6\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:08 crc kubenswrapper[4893]: I0314 07:20:08.363894 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:09 crc kubenswrapper[4893]: I0314 07:20:09.393772 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36f5ab5-4eef-47c7-af0b-7c72437db021" path="/var/lib/kubelet/pods/d36f5ab5-4eef-47c7-af0b-7c72437db021/volumes" Mar 14 07:20:10 crc kubenswrapper[4893]: I0314 07:20:10.554717 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:20:10 crc kubenswrapper[4893]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 07:20:10 crc kubenswrapper[4893]: > Mar 14 07:20:10 crc kubenswrapper[4893]: I0314 07:20:10.828604 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:10 crc kubenswrapper[4893]: E0314 07:20:10.828856 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:10 crc kubenswrapper[4893]: E0314 07:20:10.828891 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:10 crc kubenswrapper[4893]: E0314 07:20:10.828958 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:18.828937006 +0000 UTC m=+1298.091113808 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:12 crc kubenswrapper[4893]: I0314 07:20:12.058513 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" Mar 14 07:20:12 crc kubenswrapper[4893]: I0314 07:20:12.112300 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:20:12 crc kubenswrapper[4893]: I0314 07:20:12.112623 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="dnsmasq-dns" containerID="cri-o://0527ab0580ec7d69b21c7cb444daa7adb9e83a922cbe6da8a4a89673354b9b90" gracePeriod=10 Mar 14 07:20:13 crc kubenswrapper[4893]: I0314 07:20:13.097488 4893 generic.go:334] "Generic (PLEG): container finished" podID="25fed27d-191c-4fc5-8b22-493c62637e65" containerID="0527ab0580ec7d69b21c7cb444daa7adb9e83a922cbe6da8a4a89673354b9b90" exitCode=0 Mar 14 07:20:13 crc kubenswrapper[4893]: I0314 07:20:13.097591 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" event={"ID":"25fed27d-191c-4fc5-8b22-493c62637e65","Type":"ContainerDied","Data":"0527ab0580ec7d69b21c7cb444daa7adb9e83a922cbe6da8a4a89673354b9b90"} Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.368507 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.410065 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb\") pod \"25fed27d-191c-4fc5-8b22-493c62637e65\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.410190 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config\") pod \"25fed27d-191c-4fc5-8b22-493c62637e65\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.410293 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfsfj\" (UniqueName: \"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj\") pod \"25fed27d-191c-4fc5-8b22-493c62637e65\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.410371 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc\") pod \"25fed27d-191c-4fc5-8b22-493c62637e65\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.410423 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb\") pod \"25fed27d-191c-4fc5-8b22-493c62637e65\" (UID: \"25fed27d-191c-4fc5-8b22-493c62637e65\") " Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.416008 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj" (OuterVolumeSpecName: "kube-api-access-jfsfj") pod "25fed27d-191c-4fc5-8b22-493c62637e65" (UID: "25fed27d-191c-4fc5-8b22-493c62637e65"). InnerVolumeSpecName "kube-api-access-jfsfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.471304 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25fed27d-191c-4fc5-8b22-493c62637e65" (UID: "25fed27d-191c-4fc5-8b22-493c62637e65"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.474274 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25fed27d-191c-4fc5-8b22-493c62637e65" (UID: "25fed27d-191c-4fc5-8b22-493c62637e65"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.475234 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25fed27d-191c-4fc5-8b22-493c62637e65" (UID: "25fed27d-191c-4fc5-8b22-493c62637e65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.491214 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config" (OuterVolumeSpecName: "config") pod "25fed27d-191c-4fc5-8b22-493c62637e65" (UID: "25fed27d-191c-4fc5-8b22-493c62637e65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.511762 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.511798 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.511808 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfsfj\" (UniqueName: \"kubernetes.io/projected/25fed27d-191c-4fc5-8b22-493c62637e65-kube-api-access-jfsfj\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.511817 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.511826 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25fed27d-191c-4fc5-8b22-493c62637e65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.554593 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:20:15 crc kubenswrapper[4893]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 07:20:15 crc kubenswrapper[4893]: > Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.578988 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7crn6"] Mar 14 
07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.590366 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.600114 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.813702 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8rcbf-config-lp7xn"] Mar 14 07:20:15 crc kubenswrapper[4893]: E0314 07:20:15.814039 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="init" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.814054 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="init" Mar 14 07:20:15 crc kubenswrapper[4893]: E0314 07:20:15.814067 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="dnsmasq-dns" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.814074 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="dnsmasq-dns" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.814222 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" containerName="dnsmasq-dns" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.814695 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.819248 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.847314 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf-config-lp7xn"] Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919697 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919735 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919844 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919905 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: 
\"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919936 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9p6p\" (UniqueName: \"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:15 crc kubenswrapper[4893]: I0314 07:20:15.919968 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021491 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021589 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021618 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn\") pod 
\"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021708 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021763 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.021803 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9p6p\" (UniqueName: \"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.022399 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.022460 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn\") pod 
\"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.022508 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.023157 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.025220 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.043170 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9p6p\" (UniqueName: \"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p\") pod \"ovn-controller-8rcbf-config-lp7xn\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.132393 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.136297 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerStarted","Data":"ee01de4008b1335a2c5408eb31930a4fb5131e254bc97793e1f6069614788d91"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.136551 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.151543 4893 generic.go:334] "Generic (PLEG): container finished" podID="2bb86b36-ba27-4788-9a19-451d42d8a4e2" containerID="51f5220871dd0aca09702139c996f8aa409270e4f425fc07daec2280fa1f537e" exitCode=0 Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.151611 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7crn6" event={"ID":"2bb86b36-ba27-4788-9a19-451d42d8a4e2","Type":"ContainerDied","Data":"51f5220871dd0aca09702139c996f8aa409270e4f425fc07daec2280fa1f537e"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.151636 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7crn6" event={"ID":"2bb86b36-ba27-4788-9a19-451d42d8a4e2","Type":"ContainerStarted","Data":"3967414fdb74af366493bbc1b9ce0dbf72f319c75d9fc11f770a4a2474ee2ccc"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.159757 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" event={"ID":"25fed27d-191c-4fc5-8b22-493c62637e65","Type":"ContainerDied","Data":"d98ba75b2e83cde714ab20c495002a0e80255b027b65c04562918527633bf906"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.159794 4893 scope.go:117] "RemoveContainer" containerID="0527ab0580ec7d69b21c7cb444daa7adb9e83a922cbe6da8a4a89673354b9b90" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.159907 4893 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-mrhh7" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.167339 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.972509707 podStartE2EDuration="1m12.16732461s" podCreationTimestamp="2026-03-14 07:19:04 +0000 UTC" firstStartedPulling="2026-03-14 07:19:06.278952931 +0000 UTC m=+1225.541129723" lastFinishedPulling="2026-03-14 07:19:33.473767834 +0000 UTC m=+1252.735944626" observedRunningTime="2026-03-14 07:20:16.157785487 +0000 UTC m=+1295.419962289" watchObservedRunningTime="2026-03-14 07:20:16.16732461 +0000 UTC m=+1295.429501412" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.179955 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98sg2" event={"ID":"47f8afde-298d-42a9-a92e-dd3b4561b98c","Type":"ContainerStarted","Data":"6d895c5fb663549e3906c267cd862fb00a0c7b62a71de4e4499999dcb434b003"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.185139 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerStarted","Data":"a5f24be09a6c54f86e91629c32362dd6fc01dad63e4f1a792a6a2d72329e86ae"} Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.185549 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.211636 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.218889 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-mrhh7"] Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.230468 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-db-sync-98sg2" podStartSLOduration=2.1060639 podStartE2EDuration="15.230449048s" podCreationTimestamp="2026-03-14 07:20:01 +0000 UTC" firstStartedPulling="2026-03-14 07:20:01.989700463 +0000 UTC m=+1281.251877245" lastFinishedPulling="2026-03-14 07:20:15.114085601 +0000 UTC m=+1294.376262393" observedRunningTime="2026-03-14 07:20:16.218563218 +0000 UTC m=+1295.480740010" watchObservedRunningTime="2026-03-14 07:20:16.230449048 +0000 UTC m=+1295.492625860" Mar 14 07:20:16 crc kubenswrapper[4893]: I0314 07:20:16.243767 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371964.611034 podStartE2EDuration="1m12.243740532s" podCreationTimestamp="2026-03-14 07:19:04 +0000 UTC" firstStartedPulling="2026-03-14 07:19:06.913539506 +0000 UTC m=+1226.175716298" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:16.240757629 +0000 UTC m=+1295.502934431" watchObservedRunningTime="2026-03-14 07:20:16.243740532 +0000 UTC m=+1295.505917334" Mar 14 07:20:17 crc kubenswrapper[4893]: I0314 07:20:17.402769 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fed27d-191c-4fc5-8b22-493c62637e65" path="/var/lib/kubelet/pods/25fed27d-191c-4fc5-8b22-493c62637e65/volumes" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.382937 4893 scope.go:117] "RemoveContainer" containerID="f6adeb8abc5960a89a79eab5107c713c8c71bc003558950c9cd2fffd15c5edf0" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.609599 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.768800 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjxb\" (UniqueName: \"kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb\") pod \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.769273 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts\") pod \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\" (UID: \"2bb86b36-ba27-4788-9a19-451d42d8a4e2\") " Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.770367 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bb86b36-ba27-4788-9a19-451d42d8a4e2" (UID: "2bb86b36-ba27-4788-9a19-451d42d8a4e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.777007 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb" (OuterVolumeSpecName: "kube-api-access-lxjxb") pod "2bb86b36-ba27-4788-9a19-451d42d8a4e2" (UID: "2bb86b36-ba27-4788-9a19-451d42d8a4e2"). InnerVolumeSpecName "kube-api-access-lxjxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.871734 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:18 crc kubenswrapper[4893]: E0314 07:20:18.871972 4893 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:18 crc kubenswrapper[4893]: E0314 07:20:18.872040 4893 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:18 crc kubenswrapper[4893]: E0314 07:20:18.872121 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift podName:079232b7-87bb-42cf-96ff-1eb2d1cfe2b5 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:34.87209169 +0000 UTC m=+1314.134268522 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift") pod "swift-storage-0" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5") : configmap "swift-ring-files" not found Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.871981 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjxb\" (UniqueName: \"kubernetes.io/projected/2bb86b36-ba27-4788-9a19-451d42d8a4e2-kube-api-access-lxjxb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.872247 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bb86b36-ba27-4788-9a19-451d42d8a4e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:18 crc kubenswrapper[4893]: I0314 07:20:18.950584 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf-config-lp7xn"] Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.207399 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr6c4" event={"ID":"e77e9706-a6b7-4f26-9897-8f5d66642a67","Type":"ContainerStarted","Data":"371538afb0b740f188caaef5fc1d7c03c20a31b7d8b9647370541a43e9085a8a"} Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.208818 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7crn6" event={"ID":"2bb86b36-ba27-4788-9a19-451d42d8a4e2","Type":"ContainerDied","Data":"3967414fdb74af366493bbc1b9ce0dbf72f319c75d9fc11f770a4a2474ee2ccc"} Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.208844 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3967414fdb74af366493bbc1b9ce0dbf72f319c75d9fc11f770a4a2474ee2ccc" Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.208870 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7crn6" Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.210135 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-lp7xn" event={"ID":"b5ed732c-e04f-4542-860c-ef297ad9d847","Type":"ContainerStarted","Data":"21718373de219bdb23e7f873e226c950c8c75ee535d15e7bc1b3b152a28ecb40"} Mar 14 07:20:19 crc kubenswrapper[4893]: I0314 07:20:19.243731 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wr6c4" podStartSLOduration=2.372471787 podStartE2EDuration="13.243713456s" podCreationTimestamp="2026-03-14 07:20:06 +0000 UTC" firstStartedPulling="2026-03-14 07:20:07.565564434 +0000 UTC m=+1286.827741226" lastFinishedPulling="2026-03-14 07:20:18.436806093 +0000 UTC m=+1297.698982895" observedRunningTime="2026-03-14 07:20:19.236164683 +0000 UTC m=+1298.498341475" watchObservedRunningTime="2026-03-14 07:20:19.243713456 +0000 UTC m=+1298.505890248" Mar 14 07:20:20 crc kubenswrapper[4893]: I0314 07:20:20.218437 4893 generic.go:334] "Generic (PLEG): container finished" podID="b5ed732c-e04f-4542-860c-ef297ad9d847" containerID="418ccc387f45c514ece7bef0efcc443727d9cf334cffaee3ab2b54a98504fe59" exitCode=0 Mar 14 07:20:20 crc kubenswrapper[4893]: I0314 07:20:20.218496 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-lp7xn" event={"ID":"b5ed732c-e04f-4542-860c-ef297ad9d847","Type":"ContainerDied","Data":"418ccc387f45c514ece7bef0efcc443727d9cf334cffaee3ab2b54a98504fe59"} Mar 14 07:20:20 crc kubenswrapper[4893]: I0314 07:20:20.562094 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8rcbf" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.562095 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717114 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717275 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717228 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run" (OuterVolumeSpecName: "var-run") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717351 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717409 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717404 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717507 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9p6p\" (UniqueName: \"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717552 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.717635 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts\") pod \"b5ed732c-e04f-4542-860c-ef297ad9d847\" (UID: \"b5ed732c-e04f-4542-860c-ef297ad9d847\") " Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.718382 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.718666 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts" (OuterVolumeSpecName: "scripts") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.718939 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.718986 4893 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b5ed732c-e04f-4542-860c-ef297ad9d847-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.719000 4893 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.719013 4893 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.719027 4893 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ed732c-e04f-4542-860c-ef297ad9d847-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.731319 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p" (OuterVolumeSpecName: "kube-api-access-n9p6p") pod "b5ed732c-e04f-4542-860c-ef297ad9d847" (UID: "b5ed732c-e04f-4542-860c-ef297ad9d847"). InnerVolumeSpecName "kube-api-access-n9p6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:21 crc kubenswrapper[4893]: I0314 07:20:21.820935 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9p6p\" (UniqueName: \"kubernetes.io/projected/b5ed732c-e04f-4542-860c-ef297ad9d847-kube-api-access-n9p6p\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.239925 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-lp7xn" event={"ID":"b5ed732c-e04f-4542-860c-ef297ad9d847","Type":"ContainerDied","Data":"21718373de219bdb23e7f873e226c950c8c75ee535d15e7bc1b3b152a28ecb40"} Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.239960 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-lp7xn" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.239973 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21718373de219bdb23e7f873e226c950c8c75ee535d15e7bc1b3b152a28ecb40" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.246012 4893 generic.go:334] "Generic (PLEG): container finished" podID="47f8afde-298d-42a9-a92e-dd3b4561b98c" containerID="6d895c5fb663549e3906c267cd862fb00a0c7b62a71de4e4499999dcb434b003" exitCode=0 Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.246064 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98sg2" event={"ID":"47f8afde-298d-42a9-a92e-dd3b4561b98c","Type":"ContainerDied","Data":"6d895c5fb663549e3906c267cd862fb00a0c7b62a71de4e4499999dcb434b003"} Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.657548 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8rcbf-config-lp7xn"] Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.662602 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8rcbf-config-lp7xn"] Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.798457 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8rcbf-config-qb9pq"] Mar 14 07:20:22 crc kubenswrapper[4893]: E0314 07:20:22.799272 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb86b36-ba27-4788-9a19-451d42d8a4e2" containerName="mariadb-account-create-update" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.799423 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb86b36-ba27-4788-9a19-451d42d8a4e2" containerName="mariadb-account-create-update" Mar 14 07:20:22 crc kubenswrapper[4893]: E0314 07:20:22.799618 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ed732c-e04f-4542-860c-ef297ad9d847" containerName="ovn-config" Mar 14 07:20:22 
crc kubenswrapper[4893]: I0314 07:20:22.799803 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ed732c-e04f-4542-860c-ef297ad9d847" containerName="ovn-config" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.800268 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ed732c-e04f-4542-860c-ef297ad9d847" containerName="ovn-config" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.800409 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb86b36-ba27-4788-9a19-451d42d8a4e2" containerName="mariadb-account-create-update" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.801442 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.805567 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.829727 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf-config-qb9pq"] Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.938958 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cwn\" (UniqueName: \"kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.939009 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: 
I0314 07:20:22.939147 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.939236 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.939283 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:22 crc kubenswrapper[4893]: I0314 07:20:22.939353 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.040912 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " 
pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041184 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cwn\" (UniqueName: \"kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041290 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041315 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041547 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041563 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 
14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041770 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041862 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.041882 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.042588 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.044183 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: 
I0314 07:20:23.075439 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cwn\" (UniqueName: \"kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn\") pod \"ovn-controller-8rcbf-config-qb9pq\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.141585 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.386759 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ed732c-e04f-4542-860c-ef297ad9d847" path="/var/lib/kubelet/pods/b5ed732c-e04f-4542-860c-ef297ad9d847/volumes" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.614494 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8rcbf-config-qb9pq"] Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.714657 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.852712 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6445s\" (UniqueName: \"kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s\") pod \"47f8afde-298d-42a9-a92e-dd3b4561b98c\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.852758 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data\") pod \"47f8afde-298d-42a9-a92e-dd3b4561b98c\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.852899 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data\") pod \"47f8afde-298d-42a9-a92e-dd3b4561b98c\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.852988 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle\") pod \"47f8afde-298d-42a9-a92e-dd3b4561b98c\" (UID: \"47f8afde-298d-42a9-a92e-dd3b4561b98c\") " Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.858794 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s" (OuterVolumeSpecName: "kube-api-access-6445s") pod "47f8afde-298d-42a9-a92e-dd3b4561b98c" (UID: "47f8afde-298d-42a9-a92e-dd3b4561b98c"). InnerVolumeSpecName "kube-api-access-6445s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.859421 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "47f8afde-298d-42a9-a92e-dd3b4561b98c" (UID: "47f8afde-298d-42a9-a92e-dd3b4561b98c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.890898 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f8afde-298d-42a9-a92e-dd3b4561b98c" (UID: "47f8afde-298d-42a9-a92e-dd3b4561b98c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.908585 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data" (OuterVolumeSpecName: "config-data") pod "47f8afde-298d-42a9-a92e-dd3b4561b98c" (UID: "47f8afde-298d-42a9-a92e-dd3b4561b98c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.954495 4893 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.954550 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.954562 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6445s\" (UniqueName: \"kubernetes.io/projected/47f8afde-298d-42a9-a92e-dd3b4561b98c-kube-api-access-6445s\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:23 crc kubenswrapper[4893]: I0314 07:20:23.954573 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f8afde-298d-42a9-a92e-dd3b4561b98c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.288200 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-98sg2" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.288212 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98sg2" event={"ID":"47f8afde-298d-42a9-a92e-dd3b4561b98c","Type":"ContainerDied","Data":"dd59586a4186f8d0aaa06ad45d497f838321bcfb53ab8cd0efdb854d02ebc0e0"} Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.288687 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd59586a4186f8d0aaa06ad45d497f838321bcfb53ab8cd0efdb854d02ebc0e0" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.290271 4893 generic.go:334] "Generic (PLEG): container finished" podID="b0a0f907-332e-4e2f-bf23-586e7728a26e" containerID="c474a796e7a35b3a593b6197965605c48c121d839c4564ec15637bd336060fca" exitCode=0 Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.290320 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-qb9pq" event={"ID":"b0a0f907-332e-4e2f-bf23-586e7728a26e","Type":"ContainerDied","Data":"c474a796e7a35b3a593b6197965605c48c121d839c4564ec15637bd336060fca"} Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.290354 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-qb9pq" event={"ID":"b0a0f907-332e-4e2f-bf23-586e7728a26e","Type":"ContainerStarted","Data":"351f2ae3d325d161a0485e4024d334b8ce92946c75c5d986a88d23ef43a51c59"} Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.686929 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:20:24 crc kubenswrapper[4893]: E0314 07:20:24.687354 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f8afde-298d-42a9-a92e-dd3b4561b98c" containerName="glance-db-sync" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.687376 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="47f8afde-298d-42a9-a92e-dd3b4561b98c" containerName="glance-db-sync" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.687589 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f8afde-298d-42a9-a92e-dd3b4561b98c" containerName="glance-db-sync" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.688649 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.711879 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.777182 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.777289 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.777334 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g865\" (UniqueName: \"kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.777359 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.777440 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.878796 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.878880 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.878914 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g865\" (UniqueName: \"kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.878933 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.878973 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.879710 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.879719 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.880205 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.880709 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: 
\"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:24 crc kubenswrapper[4893]: I0314 07:20:24.900343 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g865\" (UniqueName: \"kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865\") pod \"dnsmasq-dns-6cd86fcf7-fcmhz\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.007827 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.225878 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.302414 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" event={"ID":"cc2d4699-ea9a-426b-b441-ee0d9e32445c","Type":"ContainerStarted","Data":"6e57fccf3512fd79d83640ff4bea77de577747c0f24b33be1d3f2a5f8a1e07a2"} Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.668700 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.941855 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6g9t6"] Mar 14 07:20:25 crc kubenswrapper[4893]: I0314 07:20:25.942835 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.024472 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6g9t6"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.047895 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f553-account-create-update-d5kx6"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.048995 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.054889 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.071582 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f553-account-create-update-d5kx6"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.104690 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.104756 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7vg\" (UniqueName: \"kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.206425 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtmb\" (UniqueName: 
\"kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.206502 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.206550 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.206596 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7vg\" (UniqueName: \"kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.207790 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.236242 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ftbdk"] Mar 14 07:20:26 crc 
kubenswrapper[4893]: I0314 07:20:26.237733 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.247576 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ftbdk"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.257460 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7vg\" (UniqueName: \"kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg\") pod \"cinder-db-create-6g9t6\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.309685 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwtmb\" (UniqueName: \"kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.309757 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.311258 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 
07:20:26.314758 4893 generic.go:334] "Generic (PLEG): container finished" podID="e77e9706-a6b7-4f26-9897-8f5d66642a67" containerID="371538afb0b740f188caaef5fc1d7c03c20a31b7d8b9647370541a43e9085a8a" exitCode=0 Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.314797 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr6c4" event={"ID":"e77e9706-a6b7-4f26-9897-8f5d66642a67","Type":"ContainerDied","Data":"371538afb0b740f188caaef5fc1d7c03c20a31b7d8b9647370541a43e9085a8a"} Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.331178 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwtmb\" (UniqueName: \"kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb\") pod \"cinder-f553-account-create-update-d5kx6\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.348734 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8knjw"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.349984 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.363184 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78ff-account-create-update-8m2n9"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.364342 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.364636 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.366998 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.377215 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8knjw"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.388738 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78ff-account-create-update-8m2n9"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.404697 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5gj2n"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.405717 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.409866 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.410135 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.410214 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.411134 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts\") pod \"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.411222 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nv77\" (UniqueName: 
\"kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77\") pod \"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.412155 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k497z" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.419120 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5gj2n"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.428320 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.453887 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8d2e-account-create-update-db4t8"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.455293 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.459891 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.470931 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8d2e-account-create-update-db4t8"] Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.512898 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513064 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbt4\" (UniqueName: \"kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513202 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts\") pod \"neutron-78ff-account-create-update-8m2n9\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513260 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data\") pod \"keystone-db-sync-5gj2n\" (UID: 
\"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513355 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfsp\" (UniqueName: \"kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp\") pod \"neutron-78ff-account-create-update-8m2n9\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513458 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts\") pod \"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513483 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.513561 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmhf\" (UniqueName: \"kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.514082 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nv77\" (UniqueName: \"kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77\") pod 
\"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.514394 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts\") pod \"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.534154 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nv77\" (UniqueName: \"kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77\") pod \"barbican-db-create-ftbdk\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.556423 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.580409 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615208 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts\") pod \"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615324 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbt4\" (UniqueName: \"kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615349 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts\") pod \"neutron-78ff-account-create-update-8m2n9\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615371 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data\") pod \"keystone-db-sync-5gj2n\" (UID: 
\"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615398 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfsp\" (UniqueName: \"kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp\") pod \"neutron-78ff-account-create-update-8m2n9\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615427 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615459 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmhf\" (UniqueName: \"kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.615494 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wb9\" (UniqueName: \"kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9\") pod \"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.616145 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts\") pod 
\"neutron-78ff-account-create-update-8m2n9\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.617741 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.619404 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.626474 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.631450 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbt4\" (UniqueName: \"kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4\") pod \"neutron-db-create-8knjw\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.631884 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfsp\" (UniqueName: \"kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp\") pod \"neutron-78ff-account-create-update-8m2n9\" (UID: 
\"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.633252 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmhf\" (UniqueName: \"kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf\") pod \"keystone-db-sync-5gj2n\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") " pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.690392 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.705721 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.717383 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wb9\" (UniqueName: \"kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9\") pod \"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.717918 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts\") pod \"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.721258 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts\") pod 
\"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.733648 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5gj2n" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.733825 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wb9\" (UniqueName: \"kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9\") pod \"barbican-8d2e-account-create-update-db4t8\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:26 crc kubenswrapper[4893]: I0314 07:20:26.784976 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.188353 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.329078 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8rcbf-config-qb9pq" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.329442 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf-config-qb9pq" event={"ID":"b0a0f907-332e-4e2f-bf23-586e7728a26e","Type":"ContainerDied","Data":"351f2ae3d325d161a0485e4024d334b8ce92946c75c5d986a88d23ef43a51c59"} Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.333904 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="351f2ae3d325d161a0485e4024d334b8ce92946c75c5d986a88d23ef43a51c59" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.340894 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.340937 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.340998 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8cwn\" (UniqueName: \"kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.341142 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: 
\"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.341194 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.341219 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn\") pod \"b0a0f907-332e-4e2f-bf23-586e7728a26e\" (UID: \"b0a0f907-332e-4e2f-bf23-586e7728a26e\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.341752 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.342603 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.342618 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run" (OuterVolumeSpecName: "var-run") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.343231 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.343482 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts" (OuterVolumeSpecName: "scripts") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.354671 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn" (OuterVolumeSpecName: "kube-api-access-f8cwn") pod "b0a0f907-332e-4e2f-bf23-586e7728a26e" (UID: "b0a0f907-332e-4e2f-bf23-586e7728a26e"). InnerVolumeSpecName "kube-api-access-f8cwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442868 4893 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442895 4893 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442903 4893 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0a0f907-332e-4e2f-bf23-586e7728a26e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442912 4893 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442923 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a0f907-332e-4e2f-bf23-586e7728a26e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.442931 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8cwn\" (UniqueName: \"kubernetes.io/projected/b0a0f907-332e-4e2f-bf23-586e7728a26e-kube-api-access-f8cwn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.762104 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f553-account-create-update-d5kx6"] Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.873024 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960217 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xdqz\" (UniqueName: \"kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960499 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960558 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960619 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960670 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960703 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.960729 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts\") pod \"e77e9706-a6b7-4f26-9897-8f5d66642a67\" (UID: \"e77e9706-a6b7-4f26-9897-8f5d66642a67\") " Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.964142 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.965674 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.972876 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz" (OuterVolumeSpecName: "kube-api-access-5xdqz") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "kube-api-access-5xdqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.974837 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:27 crc kubenswrapper[4893]: I0314 07:20:27.989820 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.010207 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts" (OuterVolumeSpecName: "scripts") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.013972 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e77e9706-a6b7-4f26-9897-8f5d66642a67" (UID: "e77e9706-a6b7-4f26-9897-8f5d66642a67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063205 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xdqz\" (UniqueName: \"kubernetes.io/projected/e77e9706-a6b7-4f26-9897-8f5d66642a67-kube-api-access-5xdqz\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063243 4893 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063256 4893 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063267 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063278 4893 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e77e9706-a6b7-4f26-9897-8f5d66642a67-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063288 4893 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e77e9706-a6b7-4f26-9897-8f5d66642a67-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.063298 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77e9706-a6b7-4f26-9897-8f5d66642a67-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.252824 4893 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6g9t6"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.267972 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8knjw"] Mar 14 07:20:28 crc kubenswrapper[4893]: W0314 07:20:28.268698 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb191a119_f244_4d99_98f2_e0ae52bd6613.slice/crio-b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48 WatchSource:0}: Error finding container b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48: Status 404 returned error can't find the container with id b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48 Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.274339 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8d2e-account-create-update-db4t8"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.335276 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ftbdk"] Mar 14 07:20:28 crc kubenswrapper[4893]: W0314 07:20:28.354978 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode580c7ca_eac4_4dd6_bdd6_478814d7f65d.slice/crio-dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552 WatchSource:0}: Error finding container dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552: Status 404 returned error can't find the container with id dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552 Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.361772 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8rcbf-config-qb9pq"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.369848 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8knjw" 
event={"ID":"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3","Type":"ContainerStarted","Data":"64364b8dbb4e912b50b183d3f62bb8beeaed0f6ecad1e98396f46ceb0b294ef7"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.375589 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8rcbf-config-qb9pq"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.394138 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6g9t6" event={"ID":"b191a119-f244-4d99-98f2-e0ae52bd6613","Type":"ContainerStarted","Data":"b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.415625 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5gj2n"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.416151 4893 generic.go:334] "Generic (PLEG): container finished" podID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerID="e6c99ae16d5df996f0d39ba2d0e7e0ea6c1dbdcf94fc8fa712802181c912483f" exitCode=0 Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.416918 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" event={"ID":"cc2d4699-ea9a-426b-b441-ee0d9e32445c","Type":"ContainerDied","Data":"e6c99ae16d5df996f0d39ba2d0e7e0ea6c1dbdcf94fc8fa712802181c912483f"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.422662 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78ff-account-create-update-8m2n9"] Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.424350 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8d2e-account-create-update-db4t8" event={"ID":"f25145f5-abb0-4f54-aea2-d23716f0af23","Type":"ContainerStarted","Data":"bad699226159786318e88611dbd83d9a2e0dce7ceff4ec265c309a12ce176fef"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.425481 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-wr6c4" event={"ID":"e77e9706-a6b7-4f26-9897-8f5d66642a67","Type":"ContainerDied","Data":"6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.425503 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc85b47370c01e18ef8e4b505fc6f861b34bbed5fa422f0d1a805a8d5e89187" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.425568 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wr6c4" Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.455089 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f553-account-create-update-d5kx6" event={"ID":"44268a3d-27a9-41ee-a0d7-38ba3b152ce5","Type":"ContainerStarted","Data":"b6856d1eab53d5c7defdef917b8853ec5b843c6ef2d675d742473e83cd281411"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.455417 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f553-account-create-update-d5kx6" event={"ID":"44268a3d-27a9-41ee-a0d7-38ba3b152ce5","Type":"ContainerStarted","Data":"2b8e3f04aea3f7fec150d4959f0bbbb8f5cb455a97702ddd70c6805443a80a65"} Mar 14 07:20:28 crc kubenswrapper[4893]: I0314 07:20:28.508328 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f553-account-create-update-d5kx6" podStartSLOduration=2.508311737 podStartE2EDuration="2.508311737s" podCreationTimestamp="2026-03-14 07:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:28.507051987 +0000 UTC m=+1307.769228779" watchObservedRunningTime="2026-03-14 07:20:28.508311737 +0000 UTC m=+1307.770488529" Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.391136 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b0a0f907-332e-4e2f-bf23-586e7728a26e" path="/var/lib/kubelet/pods/b0a0f907-332e-4e2f-bf23-586e7728a26e/volumes" Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.470911 4893 generic.go:334] "Generic (PLEG): container finished" podID="b191a119-f244-4d99-98f2-e0ae52bd6613" containerID="78dc1d4351b3b60a68c762a7af79d2853690910bf049ba88fbb38613b103188f" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.470976 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6g9t6" event={"ID":"b191a119-f244-4d99-98f2-e0ae52bd6613","Type":"ContainerDied","Data":"78dc1d4351b3b60a68c762a7af79d2853690910bf049ba88fbb38613b103188f"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.475597 4893 generic.go:334] "Generic (PLEG): container finished" podID="05bc0af5-83ce-4dd0-b4b9-53e9307905ad" containerID="076d1362e0337a646e1d3cbbd38d27f9651cbd0d40bac4c48674ad108e71c423" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.475710 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78ff-account-create-update-8m2n9" event={"ID":"05bc0af5-83ce-4dd0-b4b9-53e9307905ad","Type":"ContainerDied","Data":"076d1362e0337a646e1d3cbbd38d27f9651cbd0d40bac4c48674ad108e71c423"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.475741 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78ff-account-create-update-8m2n9" event={"ID":"05bc0af5-83ce-4dd0-b4b9-53e9307905ad","Type":"ContainerStarted","Data":"8daa364ec3cac506a583121a7a2bec4c39a856bc9e67d6f3f1d57aac5c372ec0"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.477969 4893 generic.go:334] "Generic (PLEG): container finished" podID="f25145f5-abb0-4f54-aea2-d23716f0af23" containerID="f1de5bb67f438364dc9e682d71951ebcde9019cf1468602db93d3be0fb8743d0" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.478054 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-8d2e-account-create-update-db4t8" event={"ID":"f25145f5-abb0-4f54-aea2-d23716f0af23","Type":"ContainerDied","Data":"f1de5bb67f438364dc9e682d71951ebcde9019cf1468602db93d3be0fb8743d0"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.480031 4893 generic.go:334] "Generic (PLEG): container finished" podID="44268a3d-27a9-41ee-a0d7-38ba3b152ce5" containerID="b6856d1eab53d5c7defdef917b8853ec5b843c6ef2d675d742473e83cd281411" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.480107 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f553-account-create-update-d5kx6" event={"ID":"44268a3d-27a9-41ee-a0d7-38ba3b152ce5","Type":"ContainerDied","Data":"b6856d1eab53d5c7defdef917b8853ec5b843c6ef2d675d742473e83cd281411"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.488178 4893 generic.go:334] "Generic (PLEG): container finished" podID="e580c7ca-eac4-4dd6-bdd6-478814d7f65d" containerID="f5264743c8414ef7e28aed218c88f09a10839261e5063a2e3bd80f60820a76e0" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.488253 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftbdk" event={"ID":"e580c7ca-eac4-4dd6-bdd6-478814d7f65d","Type":"ContainerDied","Data":"f5264743c8414ef7e28aed218c88f09a10839261e5063a2e3bd80f60820a76e0"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.488278 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftbdk" event={"ID":"e580c7ca-eac4-4dd6-bdd6-478814d7f65d","Type":"ContainerStarted","Data":"dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.492044 4893 generic.go:334] "Generic (PLEG): container finished" podID="ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" containerID="d721f52196064cd1c51902654dbbd16ab932abd7746c212327ec11b5046ddae0" exitCode=0 Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.492094 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8knjw" event={"ID":"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3","Type":"ContainerDied","Data":"d721f52196064cd1c51902654dbbd16ab932abd7746c212327ec11b5046ddae0"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.493880 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5gj2n" event={"ID":"62b69960-585e-4e89-a290-e00ea2f20283","Type":"ContainerStarted","Data":"b6c1b2f06666db6db3412315260b2bc3bbfee3a71be4ed630061353f9b7039fc"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.507395 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" event={"ID":"cc2d4699-ea9a-426b-b441-ee0d9e32445c","Type":"ContainerStarted","Data":"a2b53e42a4ed1c6046a99777cc0e3efb1372c8fc4318d274864d09b2c1d02052"} Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.507720 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:29 crc kubenswrapper[4893]: I0314 07:20:29.581114 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" podStartSLOduration=5.581098054 podStartE2EDuration="5.581098054s" podCreationTimestamp="2026-03-14 07:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:29.563782093 +0000 UTC m=+1308.825958895" watchObservedRunningTime="2026-03-14 07:20:29.581098054 +0000 UTC m=+1308.843274846" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.559028 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78ff-account-create-update-8m2n9" event={"ID":"05bc0af5-83ce-4dd0-b4b9-53e9307905ad","Type":"ContainerDied","Data":"8daa364ec3cac506a583121a7a2bec4c39a856bc9e67d6f3f1d57aac5c372ec0"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.559526 4893 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8daa364ec3cac506a583121a7a2bec4c39a856bc9e67d6f3f1d57aac5c372ec0" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.562939 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8d2e-account-create-update-db4t8" event={"ID":"f25145f5-abb0-4f54-aea2-d23716f0af23","Type":"ContainerDied","Data":"bad699226159786318e88611dbd83d9a2e0dce7ceff4ec265c309a12ce176fef"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.562989 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad699226159786318e88611dbd83d9a2e0dce7ceff4ec265c309a12ce176fef" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.567009 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f553-account-create-update-d5kx6" event={"ID":"44268a3d-27a9-41ee-a0d7-38ba3b152ce5","Type":"ContainerDied","Data":"2b8e3f04aea3f7fec150d4959f0bbbb8f5cb455a97702ddd70c6805443a80a65"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.567036 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b8e3f04aea3f7fec150d4959f0bbbb8f5cb455a97702ddd70c6805443a80a65" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.568827 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftbdk" event={"ID":"e580c7ca-eac4-4dd6-bdd6-478814d7f65d","Type":"ContainerDied","Data":"dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.568858 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd6bb47ee9e4b1b1477c88da765bd2dde49a59eeb247f1af7a20a2ba459f552" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.570671 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8knjw" 
event={"ID":"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3","Type":"ContainerDied","Data":"64364b8dbb4e912b50b183d3f62bb8beeaed0f6ecad1e98396f46ceb0b294ef7"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.570736 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64364b8dbb4e912b50b183d3f62bb8beeaed0f6ecad1e98396f46ceb0b294ef7" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.573255 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6g9t6" event={"ID":"b191a119-f244-4d99-98f2-e0ae52bd6613","Type":"ContainerDied","Data":"b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48"} Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.573302 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0534f93d1ee708e44bdae11cd409baba839496a058fbcf73c9d2f5876712f48" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.634398 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.697404 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.709896 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.737341 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.743701 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.752799 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts\") pod \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.753259 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts\") pod \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.753284 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwtmb\" (UniqueName: \"kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb\") pod \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\" (UID: \"44268a3d-27a9-41ee-a0d7-38ba3b152ce5\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.753353 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nv77\" (UniqueName: \"kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77\") pod \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\" (UID: \"e580c7ca-eac4-4dd6-bdd6-478814d7f65d\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.753975 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44268a3d-27a9-41ee-a0d7-38ba3b152ce5" (UID: "44268a3d-27a9-41ee-a0d7-38ba3b152ce5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.754815 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e580c7ca-eac4-4dd6-bdd6-478814d7f65d" (UID: "e580c7ca-eac4-4dd6-bdd6-478814d7f65d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.756015 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.757356 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77" (OuterVolumeSpecName: "kube-api-access-6nv77") pod "e580c7ca-eac4-4dd6-bdd6-478814d7f65d" (UID: "e580c7ca-eac4-4dd6-bdd6-478814d7f65d"). InnerVolumeSpecName "kube-api-access-6nv77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.759379 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb" (OuterVolumeSpecName: "kube-api-access-bwtmb") pod "44268a3d-27a9-41ee-a0d7-38ba3b152ce5" (UID: "44268a3d-27a9-41ee-a0d7-38ba3b152ce5"). InnerVolumeSpecName "kube-api-access-bwtmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854647 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sfsp\" (UniqueName: \"kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp\") pod \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854718 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87wb9\" (UniqueName: \"kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9\") pod \"f25145f5-abb0-4f54-aea2-d23716f0af23\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854786 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk7vg\" (UniqueName: \"kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg\") pod \"b191a119-f244-4d99-98f2-e0ae52bd6613\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854807 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbt4\" (UniqueName: \"kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4\") pod \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854856 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts\") pod \"b191a119-f244-4d99-98f2-e0ae52bd6613\" (UID: \"b191a119-f244-4d99-98f2-e0ae52bd6613\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854933 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts\") pod \"f25145f5-abb0-4f54-aea2-d23716f0af23\" (UID: \"f25145f5-abb0-4f54-aea2-d23716f0af23\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.854973 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts\") pod \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\" (UID: \"05bc0af5-83ce-4dd0-b4b9-53e9307905ad\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855018 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts\") pod \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\" (UID: \"ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3\") " Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855392 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855410 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855420 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwtmb\" (UniqueName: \"kubernetes.io/projected/44268a3d-27a9-41ee-a0d7-38ba3b152ce5-kube-api-access-bwtmb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855431 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nv77\" (UniqueName: 
\"kubernetes.io/projected/e580c7ca-eac4-4dd6-bdd6-478814d7f65d-kube-api-access-6nv77\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855777 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b191a119-f244-4d99-98f2-e0ae52bd6613" (UID: "b191a119-f244-4d99-98f2-e0ae52bd6613"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855808 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05bc0af5-83ce-4dd0-b4b9-53e9307905ad" (UID: "05bc0af5-83ce-4dd0-b4b9-53e9307905ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.855837 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" (UID: "ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.856198 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f25145f5-abb0-4f54-aea2-d23716f0af23" (UID: "f25145f5-abb0-4f54-aea2-d23716f0af23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.858243 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg" (OuterVolumeSpecName: "kube-api-access-lk7vg") pod "b191a119-f244-4d99-98f2-e0ae52bd6613" (UID: "b191a119-f244-4d99-98f2-e0ae52bd6613"). InnerVolumeSpecName "kube-api-access-lk7vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.858297 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp" (OuterVolumeSpecName: "kube-api-access-4sfsp") pod "05bc0af5-83ce-4dd0-b4b9-53e9307905ad" (UID: "05bc0af5-83ce-4dd0-b4b9-53e9307905ad"). InnerVolumeSpecName "kube-api-access-4sfsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.859118 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9" (OuterVolumeSpecName: "kube-api-access-87wb9") pod "f25145f5-abb0-4f54-aea2-d23716f0af23" (UID: "f25145f5-abb0-4f54-aea2-d23716f0af23"). InnerVolumeSpecName "kube-api-access-87wb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.860128 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4" (OuterVolumeSpecName: "kube-api-access-6gbt4") pod "ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" (UID: "ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3"). InnerVolumeSpecName "kube-api-access-6gbt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956806 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87wb9\" (UniqueName: \"kubernetes.io/projected/f25145f5-abb0-4f54-aea2-d23716f0af23-kube-api-access-87wb9\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956849 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk7vg\" (UniqueName: \"kubernetes.io/projected/b191a119-f244-4d99-98f2-e0ae52bd6613-kube-api-access-lk7vg\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956859 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbt4\" (UniqueName: \"kubernetes.io/projected/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-kube-api-access-6gbt4\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956869 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b191a119-f244-4d99-98f2-e0ae52bd6613-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956878 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f25145f5-abb0-4f54-aea2-d23716f0af23-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956886 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956895 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 
07:20:32 crc kubenswrapper[4893]: I0314 07:20:32.956903 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sfsp\" (UniqueName: \"kubernetes.io/projected/05bc0af5-83ce-4dd0-b4b9-53e9307905ad-kube-api-access-4sfsp\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581798 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6g9t6" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581834 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78ff-account-create-update-8m2n9" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581869 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f553-account-create-update-d5kx6" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581798 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d2e-account-create-update-db4t8" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581871 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5gj2n" event={"ID":"62b69960-585e-4e89-a290-e00ea2f20283","Type":"ContainerStarted","Data":"7ae0c82deca1cab7ae23b6448053023c9a433f9266f77cf656b48631609f0ca2"} Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.581963 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8knjw" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.583247 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ftbdk" Mar 14 07:20:33 crc kubenswrapper[4893]: I0314 07:20:33.605746 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5gj2n" podStartSLOduration=3.535547975 podStartE2EDuration="7.605726112s" podCreationTimestamp="2026-03-14 07:20:26 +0000 UTC" firstStartedPulling="2026-03-14 07:20:28.490946055 +0000 UTC m=+1307.753122847" lastFinishedPulling="2026-03-14 07:20:32.561124192 +0000 UTC m=+1311.823300984" observedRunningTime="2026-03-14 07:20:33.596775773 +0000 UTC m=+1312.858952585" watchObservedRunningTime="2026-03-14 07:20:33.605726112 +0000 UTC m=+1312.867902904" Mar 14 07:20:34 crc kubenswrapper[4893]: I0314 07:20:34.889808 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:34 crc kubenswrapper[4893]: I0314 07:20:34.898527 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " pod="openstack/swift-storage-0" Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.009663 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.052982 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.070224 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"]
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.070840 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="dnsmasq-dns" containerID="cri-o://e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83" gracePeriod=10
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.574015 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.613036 4893 generic.go:334] "Generic (PLEG): container finished" podID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerID="e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83" exitCode=0
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.613101 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" event={"ID":"82962bac-4b7c-45ec-a46b-7c2a01f4db61","Type":"ContainerDied","Data":"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"}
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.613115 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.613141 4893 scope.go:117] "RemoveContainer" containerID="e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.613129 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-b8mk8" event={"ID":"82962bac-4b7c-45ec-a46b-7c2a01f4db61","Type":"ContainerDied","Data":"bfea1c41ccd77f44e0a66427622ac65c48195904248df159f994f9e9f4cd33c1"}
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.618791 4893 generic.go:334] "Generic (PLEG): container finished" podID="62b69960-585e-4e89-a290-e00ea2f20283" containerID="7ae0c82deca1cab7ae23b6448053023c9a433f9266f77cf656b48631609f0ca2" exitCode=0
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.618855 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5gj2n" event={"ID":"62b69960-585e-4e89-a290-e00ea2f20283","Type":"ContainerDied","Data":"7ae0c82deca1cab7ae23b6448053023c9a433f9266f77cf656b48631609f0ca2"}
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.630274 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.643977 4893 scope.go:117] "RemoveContainer" containerID="98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be"
Mar 14 07:20:35 crc kubenswrapper[4893]: W0314 07:20:35.648064 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079232b7_87bb_42cf_96ff_1eb2d1cfe2b5.slice/crio-6e826cc186438c3a3cd288097294739cb23b36306cb0e69a85d3829ae1ec9589 WatchSource:0}: Error finding container 6e826cc186438c3a3cd288097294739cb23b36306cb0e69a85d3829ae1ec9589: Status 404 returned error can't find the container with id 6e826cc186438c3a3cd288097294739cb23b36306cb0e69a85d3829ae1ec9589
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.663323 4893 scope.go:117] "RemoveContainer" containerID="e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"
Mar 14 07:20:35 crc kubenswrapper[4893]: E0314 07:20:35.663764 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83\": container with ID starting with e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83 not found: ID does not exist" containerID="e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.663789 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83"} err="failed to get container status \"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83\": rpc error: code = NotFound desc = could not find container \"e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83\": container with ID starting with e410af8fdac966c35f099c0771adc94f1028ee35fd0e61eb19085b6a9b8c1b83 not found: ID does not exist"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.663807 4893 scope.go:117] "RemoveContainer" containerID="98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be"
Mar 14 07:20:35 crc kubenswrapper[4893]: E0314 07:20:35.664273 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be\": container with ID starting with 98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be not found: ID does not exist" containerID="98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.664310 4893 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be"} err="failed to get container status \"98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be\": rpc error: code = NotFound desc = could not find container \"98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be\": container with ID starting with 98de0aee6c451067a2b02027ded48994f7fca5c71d35a372b7006271043700be not found: ID does not exist"
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.707591 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb\") pod \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") "
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.707698 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb\") pod \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") "
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.707812 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bglfz\" (UniqueName: \"kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz\") pod \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") "
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.707842 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc\") pod \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") "
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.707880 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config\") pod \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\" (UID: \"82962bac-4b7c-45ec-a46b-7c2a01f4db61\") "
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.713981 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz" (OuterVolumeSpecName: "kube-api-access-bglfz") pod "82962bac-4b7c-45ec-a46b-7c2a01f4db61" (UID: "82962bac-4b7c-45ec-a46b-7c2a01f4db61"). InnerVolumeSpecName "kube-api-access-bglfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.746013 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82962bac-4b7c-45ec-a46b-7c2a01f4db61" (UID: "82962bac-4b7c-45ec-a46b-7c2a01f4db61"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.748411 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82962bac-4b7c-45ec-a46b-7c2a01f4db61" (UID: "82962bac-4b7c-45ec-a46b-7c2a01f4db61"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.748771 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82962bac-4b7c-45ec-a46b-7c2a01f4db61" (UID: "82962bac-4b7c-45ec-a46b-7c2a01f4db61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.748966 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config" (OuterVolumeSpecName: "config") pod "82962bac-4b7c-45ec-a46b-7c2a01f4db61" (UID: "82962bac-4b7c-45ec-a46b-7c2a01f4db61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.809837 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.809868 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bglfz\" (UniqueName: \"kubernetes.io/projected/82962bac-4b7c-45ec-a46b-7c2a01f4db61-kube-api-access-bglfz\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.809880 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.809889 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.809898 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82962bac-4b7c-45ec-a46b-7c2a01f4db61-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:35 crc kubenswrapper[4893]: I0314 07:20:35.987824 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"]
Mar 14 07:20:35 crc
kubenswrapper[4893]: I0314 07:20:35.993838 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-b8mk8"]
Mar 14 07:20:36 crc kubenswrapper[4893]: I0314 07:20:36.628784 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"6e826cc186438c3a3cd288097294739cb23b36306cb0e69a85d3829ae1ec9589"}
Mar 14 07:20:36 crc kubenswrapper[4893]: I0314 07:20:36.993953 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5gj2n"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.128712 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lmhf\" (UniqueName: \"kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf\") pod \"62b69960-585e-4e89-a290-e00ea2f20283\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") "
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.128785 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle\") pod \"62b69960-585e-4e89-a290-e00ea2f20283\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") "
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.128839 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data\") pod \"62b69960-585e-4e89-a290-e00ea2f20283\" (UID: \"62b69960-585e-4e89-a290-e00ea2f20283\") "
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.133942 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf" (OuterVolumeSpecName: "kube-api-access-6lmhf") pod "62b69960-585e-4e89-a290-e00ea2f20283" (UID: "62b69960-585e-4e89-a290-e00ea2f20283"). InnerVolumeSpecName "kube-api-access-6lmhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.163033 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62b69960-585e-4e89-a290-e00ea2f20283" (UID: "62b69960-585e-4e89-a290-e00ea2f20283"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.195200 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data" (OuterVolumeSpecName: "config-data") pod "62b69960-585e-4e89-a290-e00ea2f20283" (UID: "62b69960-585e-4e89-a290-e00ea2f20283"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.234828 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lmhf\" (UniqueName: \"kubernetes.io/projected/62b69960-585e-4e89-a290-e00ea2f20283-kube-api-access-6lmhf\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.234871 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.234880 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b69960-585e-4e89-a290-e00ea2f20283-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.390833 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" path="/var/lib/kubelet/pods/82962bac-4b7c-45ec-a46b-7c2a01f4db61/volumes"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.639692 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5gj2n" event={"ID":"62b69960-585e-4e89-a290-e00ea2f20283","Type":"ContainerDied","Data":"b6c1b2f06666db6db3412315260b2bc3bbfee3a71be4ed630061353f9b7039fc"}
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.639729 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c1b2f06666db6db3412315260b2bc3bbfee3a71be4ed630061353f9b7039fc"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.640177 4893 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-sync-5gj2n"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.642082 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2"}
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.642145 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3"}
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.642155 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f"}
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819185 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"]
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819506 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a0f907-332e-4e2f-bf23-586e7728a26e" containerName="ovn-config"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819526 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a0f907-332e-4e2f-bf23-586e7728a26e" containerName="ovn-config"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819552 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="init"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819559 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="init"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819573 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="dnsmasq-dns"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819579 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="dnsmasq-dns"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819588 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191a119-f244-4d99-98f2-e0ae52bd6613" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819593 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191a119-f244-4d99-98f2-e0ae52bd6613" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819607 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77e9706-a6b7-4f26-9897-8f5d66642a67" containerName="swift-ring-rebalance"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819613 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77e9706-a6b7-4f26-9897-8f5d66642a67" containerName="swift-ring-rebalance"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819624 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25145f5-abb0-4f54-aea2-d23716f0af23" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819631 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25145f5-abb0-4f54-aea2-d23716f0af23" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819641 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819647 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819657 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05bc0af5-83ce-4dd0-b4b9-53e9307905ad" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819662 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="05bc0af5-83ce-4dd0-b4b9-53e9307905ad" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819677 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b69960-585e-4e89-a290-e00ea2f20283" containerName="keystone-db-sync"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819685 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b69960-585e-4e89-a290-e00ea2f20283" containerName="keystone-db-sync"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819704 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580c7ca-eac4-4dd6-bdd6-478814d7f65d" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819712 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e580c7ca-eac4-4dd6-bdd6-478814d7f65d" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: E0314 07:20:37.819727 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44268a3d-27a9-41ee-a0d7-38ba3b152ce5" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819735 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="44268a3d-27a9-41ee-a0d7-38ba3b152ce5" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819893 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819909 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25145f5-abb0-4f54-aea2-d23716f0af23" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819919 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="82962bac-4b7c-45ec-a46b-7c2a01f4db61" containerName="dnsmasq-dns"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819928 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="44268a3d-27a9-41ee-a0d7-38ba3b152ce5" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819934 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b191a119-f244-4d99-98f2-e0ae52bd6613" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819948 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="05bc0af5-83ce-4dd0-b4b9-53e9307905ad" containerName="mariadb-account-create-update"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819955 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77e9706-a6b7-4f26-9897-8f5d66642a67" containerName="swift-ring-rebalance"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819964 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a0f907-332e-4e2f-bf23-586e7728a26e" containerName="ovn-config"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819973 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b69960-585e-4e89-a290-e00ea2f20283" containerName="keystone-db-sync"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.819981 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580c7ca-eac4-4dd6-bdd6-478814d7f65d" containerName="mariadb-database-create"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.820832 4893 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.840937 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"]
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.888798 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ktdtt"]
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.889866 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.893427 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.893471 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.893705 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.893810 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.893993 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k497z"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.898153 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ktdtt"]
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.949796 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950119 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950259 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950416 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950551 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950661 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950780 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.950886 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t26qz\" (UniqueName: \"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.951072 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.951221 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:37 crc kubenswrapper[4893]: I0314 07:20:37.952470 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzs7t\" (UniqueName: \"kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc
kubenswrapper[4893]: I0314 07:20:38.054498 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.055592 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.057383 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058111 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058230 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t26qz\" (UniqueName: \"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058415 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058557 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058653 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzs7t\" (UniqueName: \"kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058806 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.058951 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.059076 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.059968 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.057335 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.056624 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.061666 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5"
Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.067643 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data\") pod \"keystone-bootstrap-ktdtt\" (UID:
\"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.069244 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.069897 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.070206 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.092819 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.110072 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzs7t\" (UniqueName: \"kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t\") pod \"keystone-bootstrap-ktdtt\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 
07:20:38.113652 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-pmw52"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.114664 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.117159 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.126303 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t26qz\" (UniqueName: \"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz\") pod \"dnsmasq-dns-56dddd9f87-rr9l5\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.130479 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ngkl4" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.130721 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.137993 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.146204 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8vwkl"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.147465 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.150794 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.151055 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.154411 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bqpbr" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.175183 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pmw52"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.208836 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8vwkl"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.209035 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.216579 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.238033 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.262766 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptp8\" (UniqueName: \"kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.262949 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263054 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwm9\" (UniqueName: \"kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263181 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263278 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data\") pod \"cinder-db-sync-pmw52\" (UID: 
\"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263377 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263489 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263651 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.263779 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.264081 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.274678 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 
07:20:38.292848 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365189 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365229 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gptp8\" (UniqueName: \"kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365268 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365287 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwm9\" (UniqueName: \"kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365313 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " 
pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365341 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365357 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365381 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365403 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365430 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365449 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365478 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365494 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365516 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365565 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvddj\" (UniqueName: \"kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365589 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.365878 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.373905 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.376380 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.376778 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.400302 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 
07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.400309 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.402702 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xwpzh"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.403557 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.403746 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.404045 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptp8\" (UniqueName: \"kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8\") pod \"neutron-db-sync-8vwkl\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.408463 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwm9\" (UniqueName: \"kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9\") pod \"cinder-db-sync-pmw52\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.422323 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w74g5" Mar 14 07:20:38 
crc kubenswrapper[4893]: I0314 07:20:38.422586 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.457321 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mdkpp"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.458381 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.464941 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.467257 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h869l" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.469845 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.469894 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.469948 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.469994 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470029 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvddj\" (UniqueName: \"kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470049 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vcx\" (UniqueName: \"kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470065 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470092 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470133 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.470178 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.471648 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.473616 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.475013 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.477135 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.478367 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.486300 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.488121 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.492904 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mdkpp"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.533312 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvddj\" (UniqueName: \"kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj\") pod \"ceilometer-0\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.562523 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xwpzh"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.562613 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571398 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkqc8\" (UniqueName: \"kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " 
pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571527 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571627 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78vcx\" (UniqueName: \"kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571816 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571906 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.571960 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc 
kubenswrapper[4893]: I0314 07:20:38.572076 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.572137 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.587319 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.592143 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.593121 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-pmw52" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.593279 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.593617 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.598570 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.603557 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vcx\" (UniqueName: \"kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx\") pod \"barbican-db-sync-xwpzh\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.660362 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10"} Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.662963 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.673951 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674009 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674041 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674070 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674112 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc 
kubenswrapper[4893]: I0314 07:20:38.674155 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674187 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd84l\" (UniqueName: \"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674220 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkqc8\" (UniqueName: \"kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674237 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.674267 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: 
I0314 07:20:38.675819 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.678456 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.679153 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.679459 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.708076 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.717281 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkqc8\" (UniqueName: \"kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8\") pod \"placement-db-sync-mdkpp\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.754280 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.776297 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.776373 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.776429 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.776450 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd84l\" (UniqueName: 
\"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.776475 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.778359 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.779071 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.779650 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.780543 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config\") pod 
\"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.801088 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd84l\" (UniqueName: \"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l\") pod \"dnsmasq-dns-55f778bb97-s6z9c\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.809887 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mdkpp" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.885334 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"] Mar 14 07:20:38 crc kubenswrapper[4893]: W0314 07:20:38.898780 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e6e58a_ec43_4dd9_8dae_eddbcfe782e0.slice/crio-f12e8194b949c99fade8546f1ecea09fe62970c76a7723b0883a6507d367c147 WatchSource:0}: Error finding container f12e8194b949c99fade8546f1ecea09fe62970c76a7723b0883a6507d367c147: Status 404 returned error can't find the container with id f12e8194b949c99fade8546f1ecea09fe62970c76a7723b0883a6507d367c147 Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.930193 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.993766 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.995085 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.999828 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 07:20:38 crc kubenswrapper[4893]: I0314 07:20:38.999937 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-74rpk" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.000238 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.000347 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.002204 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.052044 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ktdtt"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.079456 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.081283 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083322 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083378 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083441 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083468 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083488 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083608 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083650 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.083702 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrvv\" (UniqueName: \"kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.089552 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.090087 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.099810 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.172038 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pmw52"] 
Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189769 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189805 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189829 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrvv\" (UniqueName: \"kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189848 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189868 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 
07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189886 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189908 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189940 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189957 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnhl\" (UniqueName: \"kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.189982 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 
07:20:39.190002 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.190017 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.190063 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.190082 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.190101 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.190129 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.195060 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.199919 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.199940 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.200017 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.200311 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.205811 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.206867 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.213522 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrvv\" (UniqueName: \"kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.237368 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.292926 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293111 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293185 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293271 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293365 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293450 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293517 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.293669 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnhl\" (UniqueName: \"kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.294140 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.302883 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.303098 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.304124 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.309187 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.309441 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.311939 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.316067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnhl\" (UniqueName: \"kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: 
I0314 07:20:39.355099 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.370048 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.376014 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8vwkl"] Mar 14 07:20:39 crc kubenswrapper[4893]: W0314 07:20:39.390581 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa99848c_b3da_4307_afd1_f37484e648d6.slice/crio-6657b5601190b53d82e0b35bb8a5d46cd2e10e5b4b843181cbcd41dbc0349859 WatchSource:0}: Error finding container 6657b5601190b53d82e0b35bb8a5d46cd2e10e5b4b843181cbcd41dbc0349859: Status 404 returned error can't find the container with id 6657b5601190b53d82e0b35bb8a5d46cd2e10e5b4b843181cbcd41dbc0349859 Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.488635 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mdkpp"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.493846 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xwpzh"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.531388 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.561200 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.641570 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.677416 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pmw52" event={"ID":"7e3cdf1f-7963-494d-86f8-699e7401fe91","Type":"ContainerStarted","Data":"7e5ba0e85aa01810f970671e1773e20c17d177c13ca8b33514f7d251ab9a0481"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.678626 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerStarted","Data":"6657b5601190b53d82e0b35bb8a5d46cd2e10e5b4b843181cbcd41dbc0349859"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.679979 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vwkl" event={"ID":"8e3d2794-7a14-49ad-9ca9-a73ec58d2981","Type":"ContainerStarted","Data":"5b7e5c52c542dd7f93b569a371b5ccfcf5517d9ae865ac9c63de27f5cae741bd"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.682493 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktdtt" event={"ID":"ded78544-1cd7-41a9-b454-068d1024efd5","Type":"ContainerStarted","Data":"af0bc49ff538ba4ee33598bcf5c437e12efb9cbf76333f072b88487e74333566"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.682540 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktdtt" event={"ID":"ded78544-1cd7-41a9-b454-068d1024efd5","Type":"ContainerStarted","Data":"f317b96e3aa65dcb06a64b4b7b0260e8cb634313f64f538b941f7d5c4eff9f93"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.684198 4893 generic.go:334] "Generic (PLEG): container finished" podID="d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" 
containerID="1e3fa598f44c1c25f871366d7f96e07083ae7af95e7679bd5be8d7434c3a266c" exitCode=0 Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.684234 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" event={"ID":"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0","Type":"ContainerDied","Data":"1e3fa598f44c1c25f871366d7f96e07083ae7af95e7679bd5be8d7434c3a266c"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.684255 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" event={"ID":"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0","Type":"ContainerStarted","Data":"f12e8194b949c99fade8546f1ecea09fe62970c76a7723b0883a6507d367c147"} Mar 14 07:20:39 crc kubenswrapper[4893]: I0314 07:20:39.707374 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ktdtt" podStartSLOduration=2.707341032 podStartE2EDuration="2.707341032s" podCreationTimestamp="2026-03-14 07:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:39.700904025 +0000 UTC m=+1318.963080817" watchObservedRunningTime="2026-03-14 07:20:39.707341032 +0000 UTC m=+1318.969517824" Mar 14 07:20:39 crc kubenswrapper[4893]: W0314 07:20:39.764822 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b4b8032_3d08_4574_9987_fc159aa8c506.slice/crio-ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0 WatchSource:0}: Error finding container ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0: Status 404 returned error can't find the container with id ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0 Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.170509 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.222505 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.222579 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.222702 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.222745 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.222771 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t26qz\" (UniqueName: \"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.232014 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz" (OuterVolumeSpecName: "kube-api-access-t26qz") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). InnerVolumeSpecName "kube-api-access-t26qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.323433 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config" (OuterVolumeSpecName: "config") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.323945 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.324321 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") pod \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\" (UID: \"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0\") " Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.324684 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.324697 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t26qz\" (UniqueName: \"kubernetes.io/projected/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-kube-api-access-t26qz\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[4893]: W0314 07:20:40.324746 4893 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.324753 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.328153 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.346236 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" (UID: "d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.426511 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.428068 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.428080 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.479974 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.555733 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.706806 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerStarted","Data":"3b20b4b82b0f3862a5ef9a083c980796036f59c722735a26b1f2bb0ac387214d"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 
07:20:40.724438 4893 generic.go:334] "Generic (PLEG): container finished" podID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerID="27dc77b8acd6b38d20435727dd8faf044af0c588eb456abcf368c4493ca34d71" exitCode=0 Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.724566 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" event={"ID":"f453ebfe-e64b-475d-b37b-3cbb5f564a14","Type":"ContainerDied","Data":"27dc77b8acd6b38d20435727dd8faf044af0c588eb456abcf368c4493ca34d71"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.724592 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" event={"ID":"f453ebfe-e64b-475d-b37b-3cbb5f564a14","Type":"ContainerStarted","Data":"ce5c6ace3a8f884128f120bb6c35ee374e0c4c285ef3a0f072dcdd71ae03bee6"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.729501 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" event={"ID":"d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0","Type":"ContainerDied","Data":"f12e8194b949c99fade8546f1ecea09fe62970c76a7723b0883a6507d367c147"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.729564 4893 scope.go:117] "RemoveContainer" containerID="1e3fa598f44c1c25f871366d7f96e07083ae7af95e7679bd5be8d7434c3a266c" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.729660 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-rr9l5" Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.758512 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerStarted","Data":"76815bfa055d1dbf086e76ecc9e3e45f285d54ab82251adc1e828a5afa883948"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.802762 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdkpp" event={"ID":"0b4b8032-3d08-4574-9987-fc159aa8c506","Type":"ContainerStarted","Data":"ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.817118 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xwpzh" event={"ID":"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2","Type":"ContainerStarted","Data":"bb32cdbe87fb65aeb08da10ad9ae68aa3c26620126b3351d4bce40bac1066108"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.849424 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.880875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.880928 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.912999 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 
07:20:40.926148 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vwkl" event={"ID":"8e3d2794-7a14-49ad-9ca9-a73ec58d2981","Type":"ContainerStarted","Data":"a99af99d9bf92b3ba08ac70ddfbb37967dcaee5680fcad631e5bfa22ff771102"} Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.964586 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.975623 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-rr9l5"] Mar 14 07:20:40 crc kubenswrapper[4893]: I0314 07:20:40.980570 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8vwkl" podStartSLOduration=2.9805462179999997 podStartE2EDuration="2.980546218s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:40.952892114 +0000 UTC m=+1320.215068906" watchObservedRunningTime="2026-03-14 07:20:40.980546218 +0000 UTC m=+1320.242723010" Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.032632 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.396453 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" path="/var/lib/kubelet/pods/d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0/volumes" Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.935030 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerStarted","Data":"291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28"} Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.939998 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" event={"ID":"f453ebfe-e64b-475d-b37b-3cbb5f564a14","Type":"ContainerStarted","Data":"17b7c3a5ef23a2a562a9326a114880d79bf48b80260dab9ac2904d926ee63ab9"} Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.941628 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.945869 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerStarted","Data":"5371a646ca7b34db41e62759f3abe4915dd902f47672f17726766dc58c5c245d"} Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.961762 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" podStartSLOduration=3.961748045 podStartE2EDuration="3.961748045s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:41.957992043 +0000 UTC m=+1321.220168845" watchObservedRunningTime="2026-03-14 07:20:41.961748045 +0000 UTC m=+1321.223924837" Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.976701 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1"} Mar 14 07:20:41 crc kubenswrapper[4893]: I0314 07:20:41.976734 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051"} Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.987789 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerStarted","Data":"c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec"} Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.988219 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-log" containerID="cri-o://291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28" gracePeriod=30 Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.988745 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-httpd" containerID="cri-o://c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec" gracePeriod=30 Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.997042 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-log" containerID="cri-o://5371a646ca7b34db41e62759f3abe4915dd902f47672f17726766dc58c5c245d" gracePeriod=30 Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.997245 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerStarted","Data":"1bf61af8d794348db7eae8d549c5cff81c4796b8d03d07ad5c720faa969e1df8"} Mar 14 07:20:42 crc kubenswrapper[4893]: I0314 07:20:42.997298 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-httpd" containerID="cri-o://1bf61af8d794348db7eae8d549c5cff81c4796b8d03d07ad5c720faa969e1df8" gracePeriod=30 Mar 14 07:20:43 crc kubenswrapper[4893]: 
I0314 07:20:43.042447 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.042429312 podStartE2EDuration="5.042429312s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:43.007823551 +0000 UTC m=+1322.270000343" watchObservedRunningTime="2026-03-14 07:20:43.042429312 +0000 UTC m=+1322.304606104" Mar 14 07:20:43 crc kubenswrapper[4893]: I0314 07:20:43.062116 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.062008639 podStartE2EDuration="6.062008639s" podCreationTimestamp="2026-03-14 07:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:43.034600972 +0000 UTC m=+1322.296777764" watchObservedRunningTime="2026-03-14 07:20:43.062008639 +0000 UTC m=+1322.324185431" Mar 14 07:20:43 crc kubenswrapper[4893]: E0314 07:20:43.068541 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed2e960_779c_43b9_b2b3_fe08a1957351.slice/crio-conmon-291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed2e960_779c_43b9_b2b3_fe08a1957351.slice/crio-291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed2e960_779c_43b9_b2b3_fe08a1957351.slice/crio-c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec.scope\": RecentStats: unable to find data in memory 
cache]" Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.008985 4893 generic.go:334] "Generic (PLEG): container finished" podID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerID="c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec" exitCode=143 Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.009024 4893 generic.go:334] "Generic (PLEG): container finished" podID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerID="291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28" exitCode=143 Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.009027 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerDied","Data":"c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec"} Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.009071 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerDied","Data":"291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28"} Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.011905 4893 generic.go:334] "Generic (PLEG): container finished" podID="ded78544-1cd7-41a9-b454-068d1024efd5" containerID="af0bc49ff538ba4ee33598bcf5c437e12efb9cbf76333f072b88487e74333566" exitCode=0 Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.011975 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktdtt" event={"ID":"ded78544-1cd7-41a9-b454-068d1024efd5","Type":"ContainerDied","Data":"af0bc49ff538ba4ee33598bcf5c437e12efb9cbf76333f072b88487e74333566"} Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.015580 4893 generic.go:334] "Generic (PLEG): container finished" podID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerID="1bf61af8d794348db7eae8d549c5cff81c4796b8d03d07ad5c720faa969e1df8" exitCode=143 Mar 14 07:20:44 crc 
kubenswrapper[4893]: I0314 07:20:44.015610 4893 generic.go:334] "Generic (PLEG): container finished" podID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerID="5371a646ca7b34db41e62759f3abe4915dd902f47672f17726766dc58c5c245d" exitCode=143 Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.015651 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerDied","Data":"1bf61af8d794348db7eae8d549c5cff81c4796b8d03d07ad5c720faa969e1df8"} Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.015685 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerDied","Data":"5371a646ca7b34db41e62759f3abe4915dd902f47672f17726766dc58c5c245d"} Mar 14 07:20:44 crc kubenswrapper[4893]: I0314 07:20:44.642641 4893 scope.go:117] "RemoveContainer" containerID="d61478f864ddff1c56163078581ecd2c7ef46fb5f00c3a2cbc2635474642b2fd" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.250405 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366107 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366215 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366261 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzs7t\" (UniqueName: \"kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366330 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366351 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.366367 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys\") pod \"ded78544-1cd7-41a9-b454-068d1024efd5\" (UID: \"ded78544-1cd7-41a9-b454-068d1024efd5\") " Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.378362 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.378461 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t" (OuterVolumeSpecName: "kube-api-access-fzs7t") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "kube-api-access-fzs7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.378650 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.388421 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts" (OuterVolumeSpecName: "scripts") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.397233 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data" (OuterVolumeSpecName: "config-data") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.399675 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded78544-1cd7-41a9-b454-068d1024efd5" (UID: "ded78544-1cd7-41a9-b454-068d1024efd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.468121 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.468156 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.468170 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzs7t\" (UniqueName: \"kubernetes.io/projected/ded78544-1cd7-41a9-b454-068d1024efd5-kube-api-access-fzs7t\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.468181 4893 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc 
kubenswrapper[4893]: I0314 07:20:46.468190 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc kubenswrapper[4893]: I0314 07:20:46.468200 4893 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ded78544-1cd7-41a9-b454-068d1024efd5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.043095 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktdtt" event={"ID":"ded78544-1cd7-41a9-b454-068d1024efd5","Type":"ContainerDied","Data":"f317b96e3aa65dcb06a64b4b7b0260e8cb634313f64f538b941f7d5c4eff9f93"} Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.043403 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f317b96e3aa65dcb06a64b4b7b0260e8cb634313f64f538b941f7d5c4eff9f93" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.043156 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ktdtt" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.342466 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ktdtt"] Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.350994 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ktdtt"] Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.391860 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded78544-1cd7-41a9-b454-068d1024efd5" path="/var/lib/kubelet/pods/ded78544-1cd7-41a9-b454-068d1024efd5/volumes" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.440828 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vc8cn"] Mar 14 07:20:47 crc kubenswrapper[4893]: E0314 07:20:47.442464 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" containerName="init" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.442491 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" containerName="init" Mar 14 07:20:47 crc kubenswrapper[4893]: E0314 07:20:47.442510 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded78544-1cd7-41a9-b454-068d1024efd5" containerName="keystone-bootstrap" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.442534 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded78544-1cd7-41a9-b454-068d1024efd5" containerName="keystone-bootstrap" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.442905 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e6e58a-ec43-4dd9-8dae-eddbcfe782e0" containerName="init" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.442929 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded78544-1cd7-41a9-b454-068d1024efd5" containerName="keystone-bootstrap" Mar 14 07:20:47 crc 
kubenswrapper[4893]: I0314 07:20:47.443640 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.445746 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.445915 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.446021 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.446346 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.447550 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k497z" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.450348 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vc8cn"] Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507336 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507402 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507429 4893 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507462 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5nc\" (UniqueName: \"kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507503 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.507584 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608802 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608854 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608875 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608898 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5nc\" (UniqueName: \"kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608927 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.608975 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.615653 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.616191 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.617058 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.617135 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.618846 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys\") pod \"keystone-bootstrap-vc8cn\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.626489 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5nc\" (UniqueName: \"kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc\") pod \"keystone-bootstrap-vc8cn\" (UID: 
\"81acf3d1-58a7-43ed-808d-411467094efe\") " pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:47 crc kubenswrapper[4893]: I0314 07:20:47.822638 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:20:48 crc kubenswrapper[4893]: I0314 07:20:48.932684 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:20:49 crc kubenswrapper[4893]: I0314 07:20:49.012179 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:20:49 crc kubenswrapper[4893]: I0314 07:20:49.012863 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" containerID="cri-o://a2b53e42a4ed1c6046a99777cc0e3efb1372c8fc4318d274864d09b2c1d02052" gracePeriod=10 Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.009259 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.086805 4893 generic.go:334] "Generic (PLEG): container finished" podID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerID="a2b53e42a4ed1c6046a99777cc0e3efb1372c8fc4318d274864d09b2c1d02052" exitCode=0 Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.086857 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" event={"ID":"cc2d4699-ea9a-426b-b441-ee0d9e32445c","Type":"ContainerDied","Data":"a2b53e42a4ed1c6046a99777cc0e3efb1372c8fc4318d274864d09b2c1d02052"} Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.885241 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971035 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971101 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971193 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971236 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971349 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971414 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971564 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971598 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnhl\" (UniqueName: \"kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl\") pod \"eed2e960-779c-43b9-b2b3-fe08a1957351\" (UID: \"eed2e960-779c-43b9-b2b3-fe08a1957351\") " Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.971831 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.972644 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs" (OuterVolumeSpecName: "logs") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.972940 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.972965 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eed2e960-779c-43b9-b2b3-fe08a1957351-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.977001 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts" (OuterVolumeSpecName: "scripts") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.977570 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.991293 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl" (OuterVolumeSpecName: "kube-api-access-wgnhl") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "kube-api-access-wgnhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:50 crc kubenswrapper[4893]: I0314 07:20:50.997763 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.017610 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.037116 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data" (OuterVolumeSpecName: "config-data") pod "eed2e960-779c-43b9-b2b3-fe08a1957351" (UID: "eed2e960-779c-43b9-b2b3-fe08a1957351"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074724 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074765 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074779 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074791 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed2e960-779c-43b9-b2b3-fe08a1957351-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074835 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.074850 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnhl\" (UniqueName: \"kubernetes.io/projected/eed2e960-779c-43b9-b2b3-fe08a1957351-kube-api-access-wgnhl\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.094955 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eed2e960-779c-43b9-b2b3-fe08a1957351","Type":"ContainerDied","Data":"3b20b4b82b0f3862a5ef9a083c980796036f59c722735a26b1f2bb0ac387214d"} Mar 14 07:20:51 crc kubenswrapper[4893]: 
I0314 07:20:51.095002 4893 scope.go:117] "RemoveContainer" containerID="c36d3670bb91e6e29a07be9fd393599df904a63d7a071b64a167eee6273077ec" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.095047 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.095347 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.145911 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.175393 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.175828 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.188979 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:51 crc kubenswrapper[4893]: E0314 07:20:51.189396 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-httpd" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.189410 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-httpd" Mar 14 07:20:51 crc kubenswrapper[4893]: E0314 07:20:51.189419 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-log" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.189427 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-log" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.189651 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-log" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.189670 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" containerName="glance-httpd" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.190568 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.192511 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.195086 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.201479 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277372 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277423 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 
07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277449 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277495 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vrk\" (UniqueName: \"kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277756 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277810 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.277882 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.278094 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379462 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vrk\" (UniqueName: \"kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379635 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379689 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379759 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc 
kubenswrapper[4893]: I0314 07:20:51.379802 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379867 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379931 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.379966 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.380275 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.380363 4893 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.380417 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.384782 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.385428 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.386349 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.388719 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed2e960-779c-43b9-b2b3-fe08a1957351" 
path="/var/lib/kubelet/pods/eed2e960-779c-43b9-b2b3-fe08a1957351/volumes" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.391221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.403584 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vrk\" (UniqueName: \"kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.410804 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:20:51 crc kubenswrapper[4893]: I0314 07:20:51.510840 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:20:55 crc kubenswrapper[4893]: I0314 07:20:55.009287 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.009363 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.010099 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.176148 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.333702 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.333862 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.333893 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.333945 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvrvv\" (UniqueName: \"kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.333993 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.334019 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.334045 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.334134 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run\") pod \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\" (UID: \"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f\") " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.334836 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.335059 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs" (OuterVolumeSpecName: "logs") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.339877 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts" (OuterVolumeSpecName: "scripts") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.339964 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.340879 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv" (OuterVolumeSpecName: "kube-api-access-zvrvv") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "kube-api-access-zvrvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.370675 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b6e14bd-ad75-4482-aaf2-af8cfae25e2f","Type":"ContainerDied","Data":"76815bfa055d1dbf086e76ecc9e3e45f285d54ab82251adc1e828a5afa883948"} Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.370766 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.373366 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.396551 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data" (OuterVolumeSpecName: "config-data") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.403712 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" (UID: "2b6e14bd-ad75-4482-aaf2-af8cfae25e2f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436054 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436090 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436100 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436110 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvrvv\" (UniqueName: \"kubernetes.io/projected/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-kube-api-access-zvrvv\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436145 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436155 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436163 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.436171 4893 reconciler_common.go:293] "Volume detached 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.452785 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.538092 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.709732 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.718768 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.745167 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:00 crc kubenswrapper[4893]: E0314 07:21:00.745505 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-log" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.745537 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-log" Mar 14 07:21:00 crc kubenswrapper[4893]: E0314 07:21:00.745548 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-httpd" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.745554 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-httpd" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.745730 4893 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-httpd" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.745750 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" containerName="glance-log" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.746578 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.749054 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.750350 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.769437 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:00 crc kubenswrapper[4893]: E0314 07:21:00.829496 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 14 07:21:00 crc kubenswrapper[4893]: E0314 07:21:00.829670 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78vcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xwpzh_openstack(8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:21:00 crc kubenswrapper[4893]: E0314 07:21:00.831816 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xwpzh" 
podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.947856 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.947943 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.947963 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.947998 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.948036 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvsn\" (UniqueName: \"kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn\") pod \"glance-default-external-api-0\" (UID: 
\"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.948065 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.948103 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:00 crc kubenswrapper[4893]: I0314 07:21:00.948119 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050237 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050363 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvsn\" (UniqueName: \"kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn\") pod \"glance-default-external-api-0\" (UID: 
\"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050403 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050451 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050483 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050546 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050584 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " 
pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.050618 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.051077 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.051177 4893 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.052164 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.057023 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 
07:21:01.062393 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.063235 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.069787 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.073146 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvsn\" (UniqueName: \"kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.082414 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.367886 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:01 crc kubenswrapper[4893]: E0314 07:21:01.391305 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-xwpzh" podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.394787 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6e14bd-ad75-4482-aaf2-af8cfae25e2f" path="/var/lib/kubelet/pods/2b6e14bd-ad75-4482-aaf2-af8cfae25e2f/volumes" Mar 14 07:21:01 crc kubenswrapper[4893]: I0314 07:21:01.899851 4893 scope.go:117] "RemoveContainer" containerID="291929403e59b2a44e70b293f5731bdb5fc0ad2be12737e89eaef4c4dd970f28" Mar 14 07:21:01 crc kubenswrapper[4893]: E0314 07:21:01.930622 4893 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 14 07:21:01 crc kubenswrapper[4893]: E0314 07:21:01.930811 4893 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtwm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-pmw52_openstack(7e3cdf1f-7963-494d-86f8-699e7401fe91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:21:01 crc kubenswrapper[4893]: E0314 07:21:01.932355 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-pmw52" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.037513 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.132411 4893 scope.go:117] "RemoveContainer" containerID="1bf61af8d794348db7eae8d549c5cff81c4796b8d03d07ad5c720faa969e1df8" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.160567 4893 scope.go:117] "RemoveContainer" containerID="5371a646ca7b34db41e62759f3abe4915dd902f47672f17726766dc58c5c245d" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.171417 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb\") pod \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.171700 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g865\" (UniqueName: \"kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865\") pod \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\" (UID: 
\"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.171776 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config\") pod \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.171840 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc\") pod \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.171886 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb\") pod \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\" (UID: \"cc2d4699-ea9a-426b-b441-ee0d9e32445c\") " Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.179044 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865" (OuterVolumeSpecName: "kube-api-access-5g865") pod "cc2d4699-ea9a-426b-b441-ee0d9e32445c" (UID: "cc2d4699-ea9a-426b-b441-ee0d9e32445c"). InnerVolumeSpecName "kube-api-access-5g865". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.219621 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc2d4699-ea9a-426b-b441-ee0d9e32445c" (UID: "cc2d4699-ea9a-426b-b441-ee0d9e32445c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.228845 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc2d4699-ea9a-426b-b441-ee0d9e32445c" (UID: "cc2d4699-ea9a-426b-b441-ee0d9e32445c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.230215 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc2d4699-ea9a-426b-b441-ee0d9e32445c" (UID: "cc2d4699-ea9a-426b-b441-ee0d9e32445c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.241617 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config" (OuterVolumeSpecName: "config") pod "cc2d4699-ea9a-426b-b441-ee0d9e32445c" (UID: "cc2d4699-ea9a-426b-b441-ee0d9e32445c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.274208 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.274241 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.274253 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.274265 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g865\" (UniqueName: \"kubernetes.io/projected/cc2d4699-ea9a-426b-b441-ee0d9e32445c-kube-api-access-5g865\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.274276 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc2d4699-ea9a-426b-b441-ee0d9e32445c-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.412551 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" event={"ID":"cc2d4699-ea9a-426b-b441-ee0d9e32445c","Type":"ContainerDied","Data":"6e57fccf3512fd79d83640ff4bea77de577747c0f24b33be1d3f2a5f8a1e07a2"} Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.412574 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-fcmhz" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.412937 4893 scope.go:117] "RemoveContainer" containerID="a2b53e42a4ed1c6046a99777cc0e3efb1372c8fc4318d274864d09b2c1d02052" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.417169 4893 generic.go:334] "Generic (PLEG): container finished" podID="8e3d2794-7a14-49ad-9ca9-a73ec58d2981" containerID="a99af99d9bf92b3ba08ac70ddfbb37967dcaee5680fcad631e5bfa22ff771102" exitCode=0 Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.417229 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vwkl" event={"ID":"8e3d2794-7a14-49ad-9ca9-a73ec58d2981","Type":"ContainerDied","Data":"a99af99d9bf92b3ba08ac70ddfbb37967dcaee5680fcad631e5bfa22ff771102"} Mar 14 07:21:02 crc kubenswrapper[4893]: E0314 07:21:02.425692 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-pmw52" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.438483 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vc8cn"] Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.443207 4893 scope.go:117] "RemoveContainer" containerID="e6c99ae16d5df996f0d39ba2d0e7e0ea6c1dbdcf94fc8fa712802181c912483f" Mar 14 07:21:02 crc kubenswrapper[4893]: W0314 07:21:02.457666 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81acf3d1_58a7_43ed_808d_411467094efe.slice/crio-15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a WatchSource:0}: Error finding container 
15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a: Status 404 returned error can't find the container with id 15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.477818 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.515906 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-fcmhz"] Mar 14 07:21:02 crc kubenswrapper[4893]: I0314 07:21:02.556382 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:02 crc kubenswrapper[4893]: W0314 07:21:02.576622 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7acb9d97_b4c7_49b4_90f0_e1d5d97d2581.slice/crio-b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78 WatchSource:0}: Error finding container b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78: Status 404 returned error can't find the container with id b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78 Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.391165 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" path="/var/lib/kubelet/pods/cc2d4699-ea9a-426b-b441-ee0d9e32445c/volumes" Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.444160 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vc8cn" event={"ID":"81acf3d1-58a7-43ed-808d-411467094efe","Type":"ContainerStarted","Data":"34a97db1b93c5de877265ef53897d09a872b74dc2ac956116974a8aec710573c"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.444210 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vc8cn" 
event={"ID":"81acf3d1-58a7-43ed-808d-411467094efe","Type":"ContainerStarted","Data":"15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.468805 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.468886 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.468901 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.468911 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.468920 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.471472 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerStarted","Data":"1376e2738bb5dd05eea9d91edfaf196013317f6b8119087b98c8835756a2312d"} Mar 14 07:21:03 crc 
kubenswrapper[4893]: I0314 07:21:03.471895 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vc8cn" podStartSLOduration=16.471869805 podStartE2EDuration="16.471869805s" podCreationTimestamp="2026-03-14 07:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:03.458804037 +0000 UTC m=+1342.720980839" watchObservedRunningTime="2026-03-14 07:21:03.471869805 +0000 UTC m=+1342.734046597" Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.473503 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerStarted","Data":"7c0fa865cdda16c5e7099733a49b13b73a32bab1ba4edabe119ddc0ae6452bb4"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.473541 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerStarted","Data":"b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.480598 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdkpp" event={"ID":"0b4b8032-3d08-4574-9987-fc159aa8c506","Type":"ContainerStarted","Data":"d33597c4daeda4ec0fe028c311645b884a212bf1f0a2101cb0b7a2f6903446eb"} Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.504015 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mdkpp" podStartSLOduration=3.410581897 podStartE2EDuration="25.503991887s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="2026-03-14 07:20:39.777391008 +0000 UTC m=+1319.039567800" lastFinishedPulling="2026-03-14 07:21:01.870800998 +0000 UTC m=+1341.132977790" observedRunningTime="2026-03-14 07:21:03.500225396 
+0000 UTC m=+1342.762402198" watchObservedRunningTime="2026-03-14 07:21:03.503991887 +0000 UTC m=+1342.766168679" Mar 14 07:21:03 crc kubenswrapper[4893]: I0314 07:21:03.523172 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:04 crc kubenswrapper[4893]: W0314 07:21:04.033258 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc782f441_cf3b_4a69_965b_5d87dd4a00ad.slice/crio-5755d2f0017544014d3552adfe6784a4bfb019725b81953a5a8f764a925767ee WatchSource:0}: Error finding container 5755d2f0017544014d3552adfe6784a4bfb019725b81953a5a8f764a925767ee: Status 404 returned error can't find the container with id 5755d2f0017544014d3552adfe6784a4bfb019725b81953a5a8f764a925767ee Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.042756 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.206770 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle\") pod \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.206968 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gptp8\" (UniqueName: \"kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8\") pod \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.207083 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config\") pod 
\"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\" (UID: \"8e3d2794-7a14-49ad-9ca9-a73ec58d2981\") " Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.213553 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8" (OuterVolumeSpecName: "kube-api-access-gptp8") pod "8e3d2794-7a14-49ad-9ca9-a73ec58d2981" (UID: "8e3d2794-7a14-49ad-9ca9-a73ec58d2981"). InnerVolumeSpecName "kube-api-access-gptp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.238647 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config" (OuterVolumeSpecName: "config") pod "8e3d2794-7a14-49ad-9ca9-a73ec58d2981" (UID: "8e3d2794-7a14-49ad-9ca9-a73ec58d2981"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.244636 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e3d2794-7a14-49ad-9ca9-a73ec58d2981" (UID: "8e3d2794-7a14-49ad-9ca9-a73ec58d2981"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.309118 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.309154 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gptp8\" (UniqueName: \"kubernetes.io/projected/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-kube-api-access-gptp8\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.309167 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e3d2794-7a14-49ad-9ca9-a73ec58d2981-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.487612 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerStarted","Data":"5755d2f0017544014d3552adfe6784a4bfb019725b81953a5a8f764a925767ee"} Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.504022 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5"} Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.504076 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerStarted","Data":"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d"} Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.517233 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vwkl" 
event={"ID":"8e3d2794-7a14-49ad-9ca9-a73ec58d2981","Type":"ContainerDied","Data":"5b7e5c52c542dd7f93b569a371b5ccfcf5517d9ae865ac9c63de27f5cae741bd"} Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.517277 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7e5c52c542dd7f93b569a371b5ccfcf5517d9ae865ac9c63de27f5cae741bd" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.517347 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8vwkl" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.523588 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerStarted","Data":"7789f926434baf6d22b935982b2af6e952f4a729d8d70fb74ec2c13da50d7032"} Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.559411 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.326183938 podStartE2EDuration="1m3.55938772s" podCreationTimestamp="2026-03-14 07:20:01 +0000 UTC" firstStartedPulling="2026-03-14 07:20:35.650188323 +0000 UTC m=+1314.912365115" lastFinishedPulling="2026-03-14 07:21:01.883392105 +0000 UTC m=+1341.145568897" observedRunningTime="2026-03-14 07:21:04.544321093 +0000 UTC m=+1343.806497895" watchObservedRunningTime="2026-03-14 07:21:04.55938772 +0000 UTC m=+1343.821564512" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.585965 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.585946697 podStartE2EDuration="4.585946697s" podCreationTimestamp="2026-03-14 07:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:04.575679347 +0000 UTC m=+1343.837856159" 
watchObservedRunningTime="2026-03-14 07:21:04.585946697 +0000 UTC m=+1343.848123489" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.673055 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586fc8487f-gjjdq"] Mar 14 07:21:04 crc kubenswrapper[4893]: E0314 07:21:04.699115 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.699150 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" Mar 14 07:21:04 crc kubenswrapper[4893]: E0314 07:21:04.699204 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="init" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.699212 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="init" Mar 14 07:21:04 crc kubenswrapper[4893]: E0314 07:21:04.699240 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3d2794-7a14-49ad-9ca9-a73ec58d2981" containerName="neutron-db-sync" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.699249 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3d2794-7a14-49ad-9ca9-a73ec58d2981" containerName="neutron-db-sync" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.699672 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3d2794-7a14-49ad-9ca9-a73ec58d2981" containerName="neutron-db-sync" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.699708 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2d4699-ea9a-426b-b441-ee0d9e32445c" containerName="dnsmasq-dns" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.705591 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.739319 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586fc8487f-gjjdq"] Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.809066 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.810927 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.816401 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.816697 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.816862 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.816986 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bqpbr" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.821631 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.821705 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-958h6\" (UniqueName: \"kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: 
\"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.821738 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.821773 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.821836 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.847855 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924417 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924491 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-958h6\" (UniqueName: \"kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924529 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924557 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924588 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924656 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924682 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924703 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924719 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htr4q\" (UniqueName: \"kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.924737 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.925573 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.926386 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb\") 
pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.927056 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.927560 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.962665 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586fc8487f-gjjdq"] Mar 14 07:21:04 crc kubenswrapper[4893]: E0314 07:21:04.963251 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-958h6], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" podUID="b512edab-7aac-4d18-baed-7329ac1c9151" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.972485 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-958h6\" (UniqueName: \"kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6\") pod \"dnsmasq-dns-586fc8487f-gjjdq\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.991567 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:04 crc kubenswrapper[4893]: I0314 07:21:04.992837 4893 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.000498 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029404 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029452 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029488 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029515 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bl6r\" (UniqueName: \"kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029557 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029583 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029598 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htr4q\" (UniqueName: \"kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029630 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029647 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029673 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.029696 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.037657 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.043245 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.049208 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.049892 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.054941 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.055643 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htr4q\" (UniqueName: \"kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q\") pod \"neutron-b866f57b8-fbw4s\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131766 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131815 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131843 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131865 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bl6r\" (UniqueName: 
\"kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131906 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.131929 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.132632 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.132812 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.133188 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.133443 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.133786 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.147005 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.150989 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bl6r\" (UniqueName: \"kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r\") pod \"dnsmasq-dns-6d8b7f7f4c-klm8k\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.342872 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.571188 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerStarted","Data":"fee9c35478e394d2ad663afea76c8313320b0a7f236352aaa7a70481737b3c1f"} Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.573125 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.573710 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerStarted","Data":"2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f"} Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.590826 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648102 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb\") pod \"b512edab-7aac-4d18-baed-7329ac1c9151\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648184 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-958h6\" (UniqueName: \"kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6\") pod \"b512edab-7aac-4d18-baed-7329ac1c9151\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648220 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config\") pod \"b512edab-7aac-4d18-baed-7329ac1c9151\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648347 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc\") pod \"b512edab-7aac-4d18-baed-7329ac1c9151\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648391 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb\") pod \"b512edab-7aac-4d18-baed-7329ac1c9151\" (UID: \"b512edab-7aac-4d18-baed-7329ac1c9151\") " Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648852 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b512edab-7aac-4d18-baed-7329ac1c9151" (UID: "b512edab-7aac-4d18-baed-7329ac1c9151"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648863 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config" (OuterVolumeSpecName: "config") pod "b512edab-7aac-4d18-baed-7329ac1c9151" (UID: "b512edab-7aac-4d18-baed-7329ac1c9151"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.648985 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b512edab-7aac-4d18-baed-7329ac1c9151" (UID: "b512edab-7aac-4d18-baed-7329ac1c9151"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.649232 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.649250 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.649263 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.651110 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b512edab-7aac-4d18-baed-7329ac1c9151" (UID: "b512edab-7aac-4d18-baed-7329ac1c9151"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.657887 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6" (OuterVolumeSpecName: "kube-api-access-958h6") pod "b512edab-7aac-4d18-baed-7329ac1c9151" (UID: "b512edab-7aac-4d18-baed-7329ac1c9151"). InnerVolumeSpecName "kube-api-access-958h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.750620 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b512edab-7aac-4d18-baed-7329ac1c9151-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.750655 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-958h6\" (UniqueName: \"kubernetes.io/projected/b512edab-7aac-4d18-baed-7329ac1c9151-kube-api-access-958h6\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:05 crc kubenswrapper[4893]: I0314 07:21:05.852948 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.054428 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:21:06 crc kubenswrapper[4893]: W0314 07:21:06.061987 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f6301e_96d7_4b42_b14c_1286aff6c13f.slice/crio-01aafd1c37af981ae6a784c909126598b665e39f904330c0d510c03e31a4fa65 WatchSource:0}: Error finding container 01aafd1c37af981ae6a784c909126598b665e39f904330c0d510c03e31a4fa65: Status 404 returned error can't find the container with id 01aafd1c37af981ae6a784c909126598b665e39f904330c0d510c03e31a4fa65 Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.582705 
4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerStarted","Data":"66c46a8e3cf50ed9e16b93b6415976a6c522afb38e02301e75b1ac6e561113c1"} Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.583038 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerStarted","Data":"01aafd1c37af981ae6a784c909126598b665e39f904330c0d510c03e31a4fa65"} Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.583977 4893 generic.go:334] "Generic (PLEG): container finished" podID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerID="5377adfc2fccaa116e61b562ad604ad0d01cff2abe44fc2da96d2d4897c0057c" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.584016 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" event={"ID":"86774cd6-a036-4ec8-8e5a-d82247af2737","Type":"ContainerDied","Data":"5377adfc2fccaa116e61b562ad604ad0d01cff2abe44fc2da96d2d4897c0057c"} Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.584031 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" event={"ID":"86774cd6-a036-4ec8-8e5a-d82247af2737","Type":"ContainerStarted","Data":"2fd1965943467948c7c1175965b9971a2bd44044f146dca9426132ce1b25c66e"} Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.588348 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerStarted","Data":"cc50be6a0913b80b7fcb0323f8b14eebc02204609d22675c73eb7671be680d74"} Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.588371 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586fc8487f-gjjdq" Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.628282 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.628247615 podStartE2EDuration="15.628247615s" podCreationTimestamp="2026-03-14 07:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:06.623202102 +0000 UTC m=+1345.885378914" watchObservedRunningTime="2026-03-14 07:21:06.628247615 +0000 UTC m=+1345.890424427" Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.713648 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586fc8487f-gjjdq"] Mar 14 07:21:06 crc kubenswrapper[4893]: I0314 07:21:06.721712 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586fc8487f-gjjdq"] Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.385920 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b512edab-7aac-4d18-baed-7329ac1c9151" path="/var/lib/kubelet/pods/b512edab-7aac-4d18-baed-7329ac1c9151/volumes" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.541427 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.545320 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.550107 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.550474 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.550979 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.599872 4893 generic.go:334] "Generic (PLEG): container finished" podID="0b4b8032-3d08-4574-9987-fc159aa8c506" containerID="d33597c4daeda4ec0fe028c311645b884a212bf1f0a2101cb0b7a2f6903446eb" exitCode=0 Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.599956 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdkpp" event={"ID":"0b4b8032-3d08-4574-9987-fc159aa8c506","Type":"ContainerDied","Data":"d33597c4daeda4ec0fe028c311645b884a212bf1f0a2101cb0b7a2f6903446eb"} Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.602335 4893 generic.go:334] "Generic (PLEG): container finished" podID="81acf3d1-58a7-43ed-808d-411467094efe" containerID="34a97db1b93c5de877265ef53897d09a872b74dc2ac956116974a8aec710573c" exitCode=0 Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.602406 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vc8cn" event={"ID":"81acf3d1-58a7-43ed-808d-411467094efe","Type":"ContainerDied","Data":"34a97db1b93c5de877265ef53897d09a872b74dc2ac956116974a8aec710573c"} Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.607640 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" 
event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerStarted","Data":"78579ad35de8d3d335e36041fbcc2c6cb7eddbfe098db0f486431cd760fd2ad8"} Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.607820 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.610546 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" event={"ID":"86774cd6-a036-4ec8-8e5a-d82247af2737","Type":"ContainerStarted","Data":"9fcd35de59276a79b1840facdabeb20b07e44c036d429f0836781619eb44e0b1"} Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.610584 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.659421 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b866f57b8-fbw4s" podStartSLOduration=3.659400338 podStartE2EDuration="3.659400338s" podCreationTimestamp="2026-03-14 07:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:07.64923035 +0000 UTC m=+1346.911407162" watchObservedRunningTime="2026-03-14 07:21:07.659400338 +0000 UTC m=+1346.921577130" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.677433 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" podStartSLOduration=3.677416767 podStartE2EDuration="3.677416767s" podCreationTimestamp="2026-03-14 07:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:07.669960045 +0000 UTC m=+1346.932136847" watchObservedRunningTime="2026-03-14 07:21:07.677416767 +0000 UTC m=+1346.939593559" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 
07:21:07.683134 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.683198 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.683294 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.683337 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsrc\" (UniqueName: \"kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.683373 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 
07:21:07.683394 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.683421 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785056 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785120 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785237 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785258 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785483 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785553 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsrc\" (UniqueName: \"kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.785609 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.791427 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.792095 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.792467 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.793239 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.793853 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.800487 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.829030 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsrc\" (UniqueName: \"kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc\") pod \"neutron-6bc7479fc9-jhvmx\" (UID: 
\"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:07 crc kubenswrapper[4893]: I0314 07:21:07.862986 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.630446 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vc8cn" event={"ID":"81acf3d1-58a7-43ed-808d-411467094efe","Type":"ContainerDied","Data":"15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a"} Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.631076 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f51cf251ac0d35e583e58143784038f5953702421ea403eaedb57972f54c2a" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.632257 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdkpp" event={"ID":"0b4b8032-3d08-4574-9987-fc159aa8c506","Type":"ContainerDied","Data":"ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0"} Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.632281 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca062e024534d6308c9c6fe94a63e91ea2018dc1d51de4d15ba7e0da8e82ace0" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.786900 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.794924 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mdkpp" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952035 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952084 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts\") pod \"0b4b8032-3d08-4574-9987-fc159aa8c506\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952136 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952158 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs\") pod \"0b4b8032-3d08-4574-9987-fc159aa8c506\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952208 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data\") pod \"0b4b8032-3d08-4574-9987-fc159aa8c506\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952230 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952271 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkqc8\" (UniqueName: \"kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8\") pod \"0b4b8032-3d08-4574-9987-fc159aa8c506\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952299 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x5nc\" (UniqueName: \"kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952324 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952425 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle\") pod \"81acf3d1-58a7-43ed-808d-411467094efe\" (UID: \"81acf3d1-58a7-43ed-808d-411467094efe\") " Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.952447 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle\") pod \"0b4b8032-3d08-4574-9987-fc159aa8c506\" (UID: \"0b4b8032-3d08-4574-9987-fc159aa8c506\") " Mar 14 07:21:09 crc 
kubenswrapper[4893]: I0314 07:21:09.953224 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs" (OuterVolumeSpecName: "logs") pod "0b4b8032-3d08-4574-9987-fc159aa8c506" (UID: "0b4b8032-3d08-4574-9987-fc159aa8c506"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.971188 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8" (OuterVolumeSpecName: "kube-api-access-gkqc8") pod "0b4b8032-3d08-4574-9987-fc159aa8c506" (UID: "0b4b8032-3d08-4574-9987-fc159aa8c506"). InnerVolumeSpecName "kube-api-access-gkqc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.971660 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts" (OuterVolumeSpecName: "scripts") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.974891 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.975137 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts" (OuterVolumeSpecName: "scripts") pod "0b4b8032-3d08-4574-9987-fc159aa8c506" (UID: "0b4b8032-3d08-4574-9987-fc159aa8c506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:09 crc kubenswrapper[4893]: I0314 07:21:09.991171 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc" (OuterVolumeSpecName: "kube-api-access-2x5nc") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "kube-api-access-2x5nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.006793 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.023109 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data" (OuterVolumeSpecName: "config-data") pod "0b4b8032-3d08-4574-9987-fc159aa8c506" (UID: "0b4b8032-3d08-4574-9987-fc159aa8c506"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054267 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x5nc\" (UniqueName: \"kubernetes.io/projected/81acf3d1-58a7-43ed-808d-411467094efe-kube-api-access-2x5nc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054314 4893 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054324 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054333 4893 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054341 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b4b8032-3d08-4574-9987-fc159aa8c506-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054349 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054357 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054366 4893 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gkqc8\" (UniqueName: \"kubernetes.io/projected/0b4b8032-3d08-4574-9987-fc159aa8c506-kube-api-access-gkqc8\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.054623 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data" (OuterVolumeSpecName: "config-data") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.060212 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81acf3d1-58a7-43ed-808d-411467094efe" (UID: "81acf3d1-58a7-43ed-808d-411467094efe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.062146 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b4b8032-3d08-4574-9987-fc159aa8c506" (UID: "0b4b8032-3d08-4574-9987-fc159aa8c506"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.155593 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.155954 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81acf3d1-58a7-43ed-808d-411467094efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.155965 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4b8032-3d08-4574-9987-fc159aa8c506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.208987 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.650914 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerStarted","Data":"9aefa053e67c31eb3cbb6bada27fa7330f22c04186e260a16b7a1675979550cc"} Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.652757 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mdkpp" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.657253 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vc8cn" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.657257 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerStarted","Data":"70e6e56d3864bf198a3fe7ff9fce7eed328301b7a8a47671c97cf9c60930d59c"} Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.657341 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerStarted","Data":"fb1d21418e4f5a24e0fc43ce4b7a9d513c6c2d457b8866d744b8f46d082bc8e5"} Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.657355 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerStarted","Data":"abdce33177212cde9de28ef362cd6d21bd60d8807b089f1be2aef8c4f7eebcfa"} Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.657550 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.693652 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bc7479fc9-jhvmx" podStartSLOduration=3.693636664 podStartE2EDuration="3.693636664s" podCreationTimestamp="2026-03-14 07:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:10.68606218 +0000 UTC m=+1349.948239002" watchObservedRunningTime="2026-03-14 07:21:10.693636664 +0000 UTC m=+1349.955813456" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.999380 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:21:10 crc kubenswrapper[4893]: E0314 07:21:10.999869 4893 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="81acf3d1-58a7-43ed-808d-411467094efe" containerName="keystone-bootstrap" Mar 14 07:21:10 crc kubenswrapper[4893]: I0314 07:21:10.999891 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="81acf3d1-58a7-43ed-808d-411467094efe" containerName="keystone-bootstrap" Mar 14 07:21:11 crc kubenswrapper[4893]: E0314 07:21:10.999916 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4b8032-3d08-4574-9987-fc159aa8c506" containerName="placement-db-sync" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:10.999925 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4b8032-3d08-4574-9987-fc159aa8c506" containerName="placement-db-sync" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.000156 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4b8032-3d08-4574-9987-fc159aa8c506" containerName="placement-db-sync" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.000198 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="81acf3d1-58a7-43ed-808d-411467094efe" containerName="keystone-bootstrap" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.000905 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.003294 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.004275 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.004764 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.004795 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k497z" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.005080 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.008074 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.009872 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.012054 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.014598 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.014956 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h869l" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.015566 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.018815 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.019485 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.019562 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.083093 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.173967 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174035 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjlh\" (UniqueName: \"kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh\") pod \"placement-65c5f4f57d-t9vqz\" (UID: 
\"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174083 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174102 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174247 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174340 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174382 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys\") pod \"keystone-787475576f-6cj4v\" (UID: 
\"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174496 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174622 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174721 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9ps\" (UniqueName: \"kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174746 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174765 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data\") pod \"placement-65c5f4f57d-t9vqz\" (UID: 
\"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174819 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174838 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.174890 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.276904 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.276990 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys\") pod \"keystone-787475576f-6cj4v\" (UID: 
\"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277055 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277103 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277139 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9ps\" (UniqueName: \"kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277161 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277183 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc 
kubenswrapper[4893]: I0314 07:21:11.277213 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277235 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277264 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277289 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277324 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjlh\" (UniqueName: \"kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277410 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277433 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.277460 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.279357 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.282731 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.286052 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.286277 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.286765 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.286974 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.288029 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.289997 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle\") pod \"keystone-787475576f-6cj4v\" (UID: 
\"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.291029 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.291594 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.291935 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.294105 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.296753 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc 
kubenswrapper[4893]: I0314 07:21:11.302982 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjlh\" (UniqueName: \"kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh\") pod \"placement-65c5f4f57d-t9vqz\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.309160 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9ps\" (UniqueName: \"kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps\") pod \"keystone-787475576f-6cj4v\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.320189 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.334676 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.368398 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.368814 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.426045 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.436667 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.511775 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.514729 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.558866 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.583796 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.666395 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.666452 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.666465 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.666475 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.853933 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:21:11 crc kubenswrapper[4893]: W0314 07:21:11.858684 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0193b07f_cfa8_4721_bc4c_ef7f3f0d2d2a.slice/crio-396df3dc672d4824a6f00250b633c94cf41afd05817c9b12e2f6bfc896aa4a85 WatchSource:0}: Error finding container 396df3dc672d4824a6f00250b633c94cf41afd05817c9b12e2f6bfc896aa4a85: Status 404 returned error can't find the container with id 396df3dc672d4824a6f00250b633c94cf41afd05817c9b12e2f6bfc896aa4a85 Mar 14 07:21:11 crc kubenswrapper[4893]: I0314 07:21:11.879113 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:11 crc kubenswrapper[4893]: W0314 07:21:11.885647 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod596e7b0c_f030_4a30_870b_184071c1eab3.slice/crio-2873c24a7dd33d43255b2d00ad7f3a07b2907efd749752fffd5260424f1ad838 WatchSource:0}: Error finding container 2873c24a7dd33d43255b2d00ad7f3a07b2907efd749752fffd5260424f1ad838: Status 404 returned error can't find the container with id 2873c24a7dd33d43255b2d00ad7f3a07b2907efd749752fffd5260424f1ad838 Mar 14 07:21:12 crc kubenswrapper[4893]: I0314 07:21:12.676411 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerStarted","Data":"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1"} Mar 14 07:21:12 crc 
kubenswrapper[4893]: I0314 07:21:12.676875 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerStarted","Data":"2873c24a7dd33d43255b2d00ad7f3a07b2907efd749752fffd5260424f1ad838"} Mar 14 07:21:12 crc kubenswrapper[4893]: I0314 07:21:12.680323 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787475576f-6cj4v" event={"ID":"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a","Type":"ContainerStarted","Data":"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f"} Mar 14 07:21:12 crc kubenswrapper[4893]: I0314 07:21:12.680370 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787475576f-6cj4v" event={"ID":"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a","Type":"ContainerStarted","Data":"396df3dc672d4824a6f00250b633c94cf41afd05817c9b12e2f6bfc896aa4a85"} Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.569539 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-787475576f-6cj4v" podStartSLOduration=3.569493806 podStartE2EDuration="3.569493806s" podCreationTimestamp="2026-03-14 07:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:12.704599061 +0000 UTC m=+1351.966775853" watchObservedRunningTime="2026-03-14 07:21:13.569493806 +0000 UTC m=+1352.831670598" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.571450 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.573162 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.594057 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634480 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634577 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634665 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634727 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634922 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.634991 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdws\" (UniqueName: \"kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.695370 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.695393 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.696641 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerStarted","Data":"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3"} Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.696995 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.697049 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.697034 4893 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.697256 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.697269 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.733006 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65c5f4f57d-t9vqz" podStartSLOduration=3.732984706 podStartE2EDuration="3.732984706s" podCreationTimestamp="2026-03-14 07:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:13.71630635 +0000 UTC m=+1352.978483142" watchObservedRunningTime="2026-03-14 07:21:13.732984706 +0000 UTC m=+1352.995161498" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.736910 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737024 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737057 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts\") 
pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737082 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737108 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdws\" (UniqueName: \"kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737194 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.737210 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.740067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " 
pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.744465 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.744621 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.746985 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.753813 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.755475 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.766620 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4tdws\" (UniqueName: \"kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws\") pod \"placement-d4589578b-zwqpr\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.897243 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:13 crc kubenswrapper[4893]: I0314 07:21:13.973858 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:21:14 crc kubenswrapper[4893]: I0314 07:21:14.107891 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:21:14 crc kubenswrapper[4893]: I0314 07:21:14.467885 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:14 crc kubenswrapper[4893]: I0314 07:21:14.467935 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:14 crc kubenswrapper[4893]: W0314 07:21:14.604945 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2195ecfb_6eeb_48f1_8b55_c57520974663.slice/crio-38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6 WatchSource:0}: Error finding container 38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6: Status 404 returned error can't find the container with id 38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6 Mar 14 07:21:14 crc kubenswrapper[4893]: I0314 07:21:14.613301 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:21:14 crc kubenswrapper[4893]: I0314 07:21:14.715742 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerStarted","Data":"38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6"} Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.345712 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.432654 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.432880 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="dnsmasq-dns" containerID="cri-o://17b7c3a5ef23a2a562a9326a114880d79bf48b80260dab9ac2904d926ee63ab9" gracePeriod=10 Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.728704 4893 generic.go:334] "Generic (PLEG): container finished" podID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerID="17b7c3a5ef23a2a562a9326a114880d79bf48b80260dab9ac2904d926ee63ab9" exitCode=0 Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.729011 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" event={"ID":"f453ebfe-e64b-475d-b37b-3cbb5f564a14","Type":"ContainerDied","Data":"17b7c3a5ef23a2a562a9326a114880d79bf48b80260dab9ac2904d926ee63ab9"} Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.772722 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerStarted","Data":"071be3d7b6163f25cab591d62f20975ae6d40b5630f0ede440ea7ddafb12315f"} Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.772758 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" 
event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerStarted","Data":"bea0181f97653e52ef6b00db91426f325b021b8db102f6f9ca17b54176110d69"} Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.773195 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.773223 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:15 crc kubenswrapper[4893]: I0314 07:21:15.801680 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d4589578b-zwqpr" podStartSLOduration=2.801656507 podStartE2EDuration="2.801656507s" podCreationTimestamp="2026-03-14 07:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:15.790475585 +0000 UTC m=+1355.052652387" watchObservedRunningTime="2026-03-14 07:21:15.801656507 +0000 UTC m=+1355.063833299" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.015738 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.110309 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd84l\" (UniqueName: \"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l\") pod \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.110403 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb\") pod \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.110431 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb\") pod \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.110480 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc\") pod \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.110569 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config\") pod \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\" (UID: \"f453ebfe-e64b-475d-b37b-3cbb5f564a14\") " Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.120122 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l" (OuterVolumeSpecName: "kube-api-access-bd84l") pod "f453ebfe-e64b-475d-b37b-3cbb5f564a14" (UID: "f453ebfe-e64b-475d-b37b-3cbb5f564a14"). InnerVolumeSpecName "kube-api-access-bd84l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.211886 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd84l\" (UniqueName: \"kubernetes.io/projected/f453ebfe-e64b-475d-b37b-3cbb5f564a14-kube-api-access-bd84l\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.251335 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config" (OuterVolumeSpecName: "config") pod "f453ebfe-e64b-475d-b37b-3cbb5f564a14" (UID: "f453ebfe-e64b-475d-b37b-3cbb5f564a14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.275305 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f453ebfe-e64b-475d-b37b-3cbb5f564a14" (UID: "f453ebfe-e64b-475d-b37b-3cbb5f564a14"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.276255 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f453ebfe-e64b-475d-b37b-3cbb5f564a14" (UID: "f453ebfe-e64b-475d-b37b-3cbb5f564a14"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.276493 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f453ebfe-e64b-475d-b37b-3cbb5f564a14" (UID: "f453ebfe-e64b-475d-b37b-3cbb5f564a14"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.314891 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.314943 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.314957 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.314970 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f453ebfe-e64b-475d-b37b-3cbb5f564a14-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.781872 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xwpzh" event={"ID":"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2","Type":"ContainerStarted","Data":"4d963d82562ecbbe578925bc8419a6dd95330e0048641392bedfe5325585ff73"} Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.786325 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.789654 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-s6z9c" event={"ID":"f453ebfe-e64b-475d-b37b-3cbb5f564a14","Type":"ContainerDied","Data":"ce5c6ace3a8f884128f120bb6c35ee374e0c4c285ef3a0f072dcdd71ae03bee6"} Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.789739 4893 scope.go:117] "RemoveContainer" containerID="17b7c3a5ef23a2a562a9326a114880d79bf48b80260dab9ac2904d926ee63ab9" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.815663 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xwpzh" podStartSLOduration=2.680174436 podStartE2EDuration="38.815643042s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="2026-03-14 07:20:39.775108152 +0000 UTC m=+1319.037284944" lastFinishedPulling="2026-03-14 07:21:15.910576758 +0000 UTC m=+1355.172753550" observedRunningTime="2026-03-14 07:21:16.809927113 +0000 UTC m=+1356.072103945" watchObservedRunningTime="2026-03-14 07:21:16.815643042 +0000 UTC m=+1356.077819834" Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.840579 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:21:16 crc kubenswrapper[4893]: I0314 07:21:16.846275 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-s6z9c"] Mar 14 07:21:17 crc kubenswrapper[4893]: I0314 07:21:17.392684 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" path="/var/lib/kubelet/pods/f453ebfe-e64b-475d-b37b-3cbb5f564a14/volumes" Mar 14 07:21:19 crc kubenswrapper[4893]: I0314 07:21:19.817965 4893 generic.go:334] "Generic (PLEG): container finished" podID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" 
containerID="4d963d82562ecbbe578925bc8419a6dd95330e0048641392bedfe5325585ff73" exitCode=0 Mar 14 07:21:19 crc kubenswrapper[4893]: I0314 07:21:19.818083 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xwpzh" event={"ID":"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2","Type":"ContainerDied","Data":"4d963d82562ecbbe578925bc8419a6dd95330e0048641392bedfe5325585ff73"} Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.034721 4893 scope.go:117] "RemoveContainer" containerID="27dc77b8acd6b38d20435727dd8faf044af0c588eb456abcf368c4493ca34d71" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.293623 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.326712 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vcx\" (UniqueName: \"kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx\") pod \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.326798 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data\") pod \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.332941 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" (UID: "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.333558 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx" (OuterVolumeSpecName: "kube-api-access-78vcx") pod "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" (UID: "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2"). InnerVolumeSpecName "kube-api-access-78vcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.427810 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle\") pod \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\" (UID: \"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2\") " Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.428322 4893 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.428344 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78vcx\" (UniqueName: \"kubernetes.io/projected/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-kube-api-access-78vcx\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.460468 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" (UID: "8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.531894 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.846136 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pmw52" event={"ID":"7e3cdf1f-7963-494d-86f8-699e7401fe91","Type":"ContainerStarted","Data":"3dab1821b74126bc10d3f1995b8bb609c8b02b71125d8eef1e9212847979e89d"} Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848430 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerStarted","Data":"c045a9a66c64df82f2dce8de6eaed13dc3abe648d523380f389d497f149eabce"} Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848550 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-central-agent" containerID="cri-o://1376e2738bb5dd05eea9d91edfaf196013317f6b8119087b98c8835756a2312d" gracePeriod=30 Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848593 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848650 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-notification-agent" containerID="cri-o://2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f" gracePeriod=30 Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848640 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="proxy-httpd" containerID="cri-o://c045a9a66c64df82f2dce8de6eaed13dc3abe648d523380f389d497f149eabce" gracePeriod=30 Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.848647 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="sg-core" containerID="cri-o://9aefa053e67c31eb3cbb6bada27fa7330f22c04186e260a16b7a1675979550cc" gracePeriod=30 Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.855889 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xwpzh" event={"ID":"8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2","Type":"ContainerDied","Data":"bb32cdbe87fb65aeb08da10ad9ae68aa3c26620126b3351d4bce40bac1066108"} Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.855932 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb32cdbe87fb65aeb08da10ad9ae68aa3c26620126b3351d4bce40bac1066108" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.856020 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xwpzh" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.896700 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-pmw52" podStartSLOduration=1.933955569 podStartE2EDuration="43.896680216s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="2026-03-14 07:20:39.175587757 +0000 UTC m=+1318.437764559" lastFinishedPulling="2026-03-14 07:21:21.138312414 +0000 UTC m=+1360.400489206" observedRunningTime="2026-03-14 07:21:21.875298575 +0000 UTC m=+1361.137475377" watchObservedRunningTime="2026-03-14 07:21:21.896680216 +0000 UTC m=+1361.158857008" Mar 14 07:21:21 crc kubenswrapper[4893]: I0314 07:21:21.908733 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.148727978 podStartE2EDuration="43.908707979s" podCreationTimestamp="2026-03-14 07:20:38 +0000 UTC" firstStartedPulling="2026-03-14 07:20:39.394198729 +0000 UTC m=+1318.656375521" lastFinishedPulling="2026-03-14 07:21:21.15417873 +0000 UTC m=+1360.416355522" observedRunningTime="2026-03-14 07:21:21.898975842 +0000 UTC m=+1361.161152634" watchObservedRunningTime="2026-03-14 07:21:21.908707979 +0000 UTC m=+1361.170884771" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.124478 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:21:22 crc kubenswrapper[4893]: E0314 07:21:22.124838 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" containerName="barbican-db-sync" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.124850 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" containerName="barbican-db-sync" Mar 14 07:21:22 crc kubenswrapper[4893]: E0314 07:21:22.124863 4893 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="dnsmasq-dns" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.124868 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="dnsmasq-dns" Mar 14 07:21:22 crc kubenswrapper[4893]: E0314 07:21:22.130495 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="init" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.130506 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="init" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.130809 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f453ebfe-e64b-475d-b37b-3cbb5f564a14" containerName="dnsmasq-dns" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.130839 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" containerName="barbican-db-sync" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.131752 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.137964 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.138438 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.139625 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w74g5" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.140629 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.140682 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.140702 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.140753 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rxns2\" (UniqueName: \"kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.140790 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.163251 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.238884 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.240540 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242148 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242312 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242391 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242507 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxns2\" (UniqueName: \"kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242602 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" 
(UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242688 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242775 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242856 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.242936 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.243022 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88zmj\" 
(UniqueName: \"kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.245848 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.246668 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.248383 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.266690 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.267152 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.268112 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.297127 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxns2\" (UniqueName: \"kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2\") pod \"barbican-worker-68559f9fc9-2zprf\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.344578 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.346275 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.348307 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.348940 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349042 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349127 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349210 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349283 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88zmj\" (UniqueName: \"kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349373 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6x2\" (UniqueName: \"kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349445 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349587 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349666 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.349753 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.351193 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.359359 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.361275 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.366244 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.392729 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88zmj\" (UniqueName: \"kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj\") pod \"barbican-keystone-listener-6f4f4558c4-87m4w\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.418803 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.420860 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459406 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459473 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459559 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459604 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459661 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6x2\" (UniqueName: \"kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " 
pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.459695 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.460844 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.461046 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.461582 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.474819 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.475499 
4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.520355 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6x2\" (UniqueName: \"kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2\") pod \"dnsmasq-dns-77cf8fb985-qmxrz\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.521199 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.529285 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.530658 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.534804 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.599510 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.667841 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.668185 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpl68\" (UniqueName: \"kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.668211 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.668242 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom\") pod \"barbican-api-7547ccd746-fj68b\" (UID: 
\"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.668283 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.758024 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.769891 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl68\" (UniqueName: \"kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.769930 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.769960 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.770004 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.770064 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.772363 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.777185 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.778077 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.778086 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom\") pod 
\"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.789793 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpl68\" (UniqueName: \"kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68\") pod \"barbican-api-7547ccd746-fj68b\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.877472 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887310 4893 generic.go:334] "Generic (PLEG): container finished" podID="aa99848c-b3da-4307-afd1-f37484e648d6" containerID="c045a9a66c64df82f2dce8de6eaed13dc3abe648d523380f389d497f149eabce" exitCode=0 Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887343 4893 generic.go:334] "Generic (PLEG): container finished" podID="aa99848c-b3da-4307-afd1-f37484e648d6" containerID="9aefa053e67c31eb3cbb6bada27fa7330f22c04186e260a16b7a1675979550cc" exitCode=2 Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887352 4893 generic.go:334] "Generic (PLEG): container finished" podID="aa99848c-b3da-4307-afd1-f37484e648d6" containerID="1376e2738bb5dd05eea9d91edfaf196013317f6b8119087b98c8835756a2312d" exitCode=0 Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887374 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerDied","Data":"c045a9a66c64df82f2dce8de6eaed13dc3abe648d523380f389d497f149eabce"} Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887399 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerDied","Data":"9aefa053e67c31eb3cbb6bada27fa7330f22c04186e260a16b7a1675979550cc"} Mar 14 07:21:22 crc kubenswrapper[4893]: I0314 07:21:22.887411 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerDied","Data":"1376e2738bb5dd05eea9d91edfaf196013317f6b8119087b98c8835756a2312d"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.113554 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.254404 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:21:23 crc kubenswrapper[4893]: W0314 07:21:23.365048 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd95203_37e0_4d71_9fd6_a1b04882fe7b.slice/crio-2266f7a2ed2d44b471c0c433e13eae4905c8605ad53fb429d70d51ede88c6eb5 WatchSource:0}: Error finding container 2266f7a2ed2d44b471c0c433e13eae4905c8605ad53fb429d70d51ede88c6eb5: Status 404 returned error can't find the container with id 2266f7a2ed2d44b471c0c433e13eae4905c8605ad53fb429d70d51ede88c6eb5 Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.369496 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.568199 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.908316 4893 generic.go:334] "Generic (PLEG): container finished" podID="aa99848c-b3da-4307-afd1-f37484e648d6" containerID="2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f" exitCode=0 Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.908419 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerDied","Data":"2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.912384 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerStarted","Data":"c45b4223c1181b0b8ee7a4e885b762497d18d2d9d9cc57f3b95405b85294b2d2"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.914966 4893 generic.go:334] "Generic (PLEG): container finished" podID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerID="a1a1fb73e4b20dc9ca888200e4e9e9dc0978fc9b586d57aee895efacec634de8" exitCode=0 Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.915085 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" event={"ID":"2fd95203-37e0-4d71-9fd6-a1b04882fe7b","Type":"ContainerDied","Data":"a1a1fb73e4b20dc9ca888200e4e9e9dc0978fc9b586d57aee895efacec634de8"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.915119 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" event={"ID":"2fd95203-37e0-4d71-9fd6-a1b04882fe7b","Type":"ContainerStarted","Data":"2266f7a2ed2d44b471c0c433e13eae4905c8605ad53fb429d70d51ede88c6eb5"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.918071 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerStarted","Data":"a93b9a216f3578d1686b9815fb061a16066f44f3f6d7d1ecddb888b83af71220"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.923772 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" 
event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerStarted","Data":"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760"} Mar 14 07:21:23 crc kubenswrapper[4893]: I0314 07:21:23.923823 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerStarted","Data":"c7b651d861e98dcd5f1e28325f511c49663f6e9104fa4554c6ac5299f93493e2"} Mar 14 07:21:24 crc kubenswrapper[4893]: E0314 07:21:24.013839 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa99848c_b3da_4307_afd1_f37484e648d6.slice/crio-2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa99848c_b3da_4307_afd1_f37484e648d6.slice/crio-conmon-2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.362809 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507235 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507666 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507738 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507781 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvddj\" (UniqueName: \"kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507808 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507830 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.507875 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd\") pod \"aa99848c-b3da-4307-afd1-f37484e648d6\" (UID: \"aa99848c-b3da-4307-afd1-f37484e648d6\") " Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.508993 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.509358 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.529952 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj" (OuterVolumeSpecName: "kube-api-access-zvddj") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "kube-api-access-zvddj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.546453 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts" (OuterVolumeSpecName: "scripts") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.553946 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.617853 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618440 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618471 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvddj\" (UniqueName: \"kubernetes.io/projected/aa99848c-b3da-4307-afd1-f37484e648d6-kube-api-access-zvddj\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618484 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618496 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa99848c-b3da-4307-afd1-f37484e648d6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618511 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.618536 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.639666 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data" (OuterVolumeSpecName: "config-data") pod "aa99848c-b3da-4307-afd1-f37484e648d6" (UID: "aa99848c-b3da-4307-afd1-f37484e648d6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.719792 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa99848c-b3da-4307-afd1-f37484e648d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.938681 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" event={"ID":"2fd95203-37e0-4d71-9fd6-a1b04882fe7b","Type":"ContainerStarted","Data":"61eaf3db7e8b3004545b51982c662d19fbcfdd2ec68b64a3868a7bad0bc49839"} Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.938872 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.942745 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerStarted","Data":"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd"} Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.943387 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.943451 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.946648 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa99848c-b3da-4307-afd1-f37484e648d6","Type":"ContainerDied","Data":"6657b5601190b53d82e0b35bb8a5d46cd2e10e5b4b843181cbcd41dbc0349859"} Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.946712 4893 scope.go:117] "RemoveContainer" containerID="c045a9a66c64df82f2dce8de6eaed13dc3abe648d523380f389d497f149eabce" Mar 14 
07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.946892 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.968434 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" podStartSLOduration=2.968413506 podStartE2EDuration="2.968413506s" podCreationTimestamp="2026-03-14 07:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:24.962234385 +0000 UTC m=+1364.224411197" watchObservedRunningTime="2026-03-14 07:21:24.968413506 +0000 UTC m=+1364.230590298" Mar 14 07:21:24 crc kubenswrapper[4893]: I0314 07:21:24.989069 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7547ccd746-fj68b" podStartSLOduration=2.9890474080000002 podStartE2EDuration="2.989047408s" podCreationTimestamp="2026-03-14 07:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:24.986853175 +0000 UTC m=+1364.249029967" watchObservedRunningTime="2026-03-14 07:21:24.989047408 +0000 UTC m=+1364.251224200" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.022731 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.033742 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.044763 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:25 crc kubenswrapper[4893]: E0314 07:21:25.045348 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="sg-core" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 
07:21:25.045417 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="sg-core" Mar 14 07:21:25 crc kubenswrapper[4893]: E0314 07:21:25.045487 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-central-agent" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.045566 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-central-agent" Mar 14 07:21:25 crc kubenswrapper[4893]: E0314 07:21:25.045639 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-notification-agent" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.045693 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-notification-agent" Mar 14 07:21:25 crc kubenswrapper[4893]: E0314 07:21:25.045763 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="proxy-httpd" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.045816 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="proxy-httpd" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.046037 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="sg-core" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.046114 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-central-agent" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.046189 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="ceilometer-notification-agent" Mar 14 
07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.048233 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" containerName="proxy-httpd" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.052567 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.055200 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.058313 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.063641 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.229497 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.230183 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.230274 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwb4\" (UniqueName: \"kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc 
kubenswrapper[4893]: I0314 07:21:25.230339 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.230430 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.230539 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.230666 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.332510 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.332812 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.332931 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.333014 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwb4\" (UniqueName: \"kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.333191 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.333449 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.333505 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.333825 
4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.334221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.344723 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.345338 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.349360 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.356471 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " 
pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.370287 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwb4\" (UniqueName: \"kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4\") pod \"ceilometer-0\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.388680 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.403405 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa99848c-b3da-4307-afd1-f37484e648d6" path="/var/lib/kubelet/pods/aa99848c-b3da-4307-afd1-f37484e648d6/volumes" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.453205 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.455431 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.459761 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.460249 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.486436 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641065 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641114 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641190 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641282 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641316 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgwxv\" (UniqueName: \"kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641475 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.641584 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743674 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgwxv\" (UniqueName: \"kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743716 4893 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743744 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743809 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743831 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743853 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.743927 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.744274 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.747925 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.750989 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.753056 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.755052 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle\") pod 
\"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.754965 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.765294 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgwxv\" (UniqueName: \"kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv\") pod \"barbican-api-684d6b469b-c8l2b\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.806725 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.810327 4893 scope.go:117] "RemoveContainer" containerID="9aefa053e67c31eb3cbb6bada27fa7330f22c04186e260a16b7a1675979550cc" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.881563 4893 scope.go:117] "RemoveContainer" containerID="2f317dd6ab7555b498a5f2e58f54aaf4c94f2e9694be7af371bfdad72c87110f" Mar 14 07:21:25 crc kubenswrapper[4893]: I0314 07:21:25.925626 4893 scope.go:117] "RemoveContainer" containerID="1376e2738bb5dd05eea9d91edfaf196013317f6b8119087b98c8835756a2312d" Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.339703 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.462290 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.979107 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerStarted","Data":"221f8705e00dbf2446410ebd804b21ced1ff120cd2015f6c49eda2311edf140f"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.981403 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerStarted","Data":"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.981468 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerStarted","Data":"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.984287 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerStarted","Data":"a0362c79961c85ff09cc537289495d9c32c6edf16fe15acfc1b254f825509254"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.984334 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerStarted","Data":"142a5e92673e832e9dc3ab1aaae6b53df019680363c32f4373ec08b60c93fa74"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.984354 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerStarted","Data":"107594f7db66cb8800e239ba66cb0bb80a38fe43aa097c5639b7bb2ebaf04975"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.985355 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.985447 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.992569 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerStarted","Data":"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526"} Mar 14 07:21:26 crc kubenswrapper[4893]: I0314 07:21:26.992628 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerStarted","Data":"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2"} Mar 14 07:21:27 crc kubenswrapper[4893]: I0314 07:21:27.007167 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" podStartSLOduration=2.245814855 podStartE2EDuration="5.007128517s" podCreationTimestamp="2026-03-14 07:21:22 +0000 UTC" firstStartedPulling="2026-03-14 07:21:23.140053206 +0000 UTC m=+1362.402229998" lastFinishedPulling="2026-03-14 07:21:25.901366858 +0000 UTC m=+1365.163543660" observedRunningTime="2026-03-14 07:21:26.998347063 +0000 UTC m=+1366.260523855" watchObservedRunningTime="2026-03-14 07:21:27.007128517 +0000 UTC m=+1366.269305319" Mar 14 07:21:27 crc kubenswrapper[4893]: I0314 07:21:27.028777 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-684d6b469b-c8l2b" podStartSLOduration=2.028758143 podStartE2EDuration="2.028758143s" podCreationTimestamp="2026-03-14 07:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:27.02616527 +0000 UTC m=+1366.288342072" watchObservedRunningTime="2026-03-14 07:21:27.028758143 +0000 UTC m=+1366.290934935" Mar 14 07:21:28 crc kubenswrapper[4893]: I0314 07:21:28.002975 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerStarted","Data":"314663c94fd1429e5895114a853d20946087751d2d9ddaebb38e2f68bc8422e1"} Mar 14 07:21:29 crc kubenswrapper[4893]: I0314 07:21:29.036853 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerStarted","Data":"9f8707403ee49e1a1d27cf771e6fbaa144ef8a07c7f4c6fa2d32eb4eca8fdb02"} Mar 14 07:21:29 crc kubenswrapper[4893]: I0314 07:21:29.731368 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 07:21:29 crc kubenswrapper[4893]: I0314 07:21:29.731495 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:21:31 crc kubenswrapper[4893]: I0314 07:21:31.429909 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68559f9fc9-2zprf" podStartSLOduration=6.810395866 podStartE2EDuration="9.429876575s" podCreationTimestamp="2026-03-14 07:21:22 +0000 UTC" firstStartedPulling="2026-03-14 07:21:23.268024401 +0000 UTC m=+1362.530201193" lastFinishedPulling="2026-03-14 07:21:25.8875051 +0000 UTC m=+1365.149681902" observedRunningTime="2026-03-14 07:21:27.053998278 +0000 UTC m=+1366.316175080" watchObservedRunningTime="2026-03-14 07:21:31.429876575 +0000 UTC m=+1370.692053367" Mar 14 07:21:32 crc kubenswrapper[4893]: I0314 07:21:32.759661 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:32 crc kubenswrapper[4893]: I0314 07:21:32.896283 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:32 crc kubenswrapper[4893]: I0314 07:21:32.896554 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="dnsmasq-dns" containerID="cri-o://9fcd35de59276a79b1840facdabeb20b07e44c036d429f0836781619eb44e0b1" gracePeriod=10 Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.074444 4893 generic.go:334] "Generic (PLEG): container finished" podID="86774cd6-a036-4ec8-8e5a-d82247af2737" 
containerID="9fcd35de59276a79b1840facdabeb20b07e44c036d429f0836781619eb44e0b1" exitCode=0 Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.074544 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" event={"ID":"86774cd6-a036-4ec8-8e5a-d82247af2737","Type":"ContainerDied","Data":"9fcd35de59276a79b1840facdabeb20b07e44c036d429f0836781619eb44e0b1"} Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.077311 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerStarted","Data":"e0308c113eb3e2da6d8b5ec320e11fbaa78a704e31414c99fea30584046274f9"} Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.698338 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.749634 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.749802 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.749886 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bl6r\" (UniqueName: \"kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 
07:21:33.749925 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.749965 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.750029 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config\") pod \"86774cd6-a036-4ec8-8e5a-d82247af2737\" (UID: \"86774cd6-a036-4ec8-8e5a-d82247af2737\") " Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.763824 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r" (OuterVolumeSpecName: "kube-api-access-9bl6r") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "kube-api-access-9bl6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.821333 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config" (OuterVolumeSpecName: "config") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.847350 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.848889 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.851542 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.851560 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.851569 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bl6r\" (UniqueName: \"kubernetes.io/projected/86774cd6-a036-4ec8-8e5a-d82247af2737-kube-api-access-9bl6r\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.851581 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc 
kubenswrapper[4893]: I0314 07:21:33.855563 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.883975 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86774cd6-a036-4ec8-8e5a-d82247af2737" (UID: "86774cd6-a036-4ec8-8e5a-d82247af2737"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.952881 4893 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[4893]: I0314 07:21:33.952922 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86774cd6-a036-4ec8-8e5a-d82247af2737-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.086607 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" event={"ID":"86774cd6-a036-4ec8-8e5a-d82247af2737","Type":"ContainerDied","Data":"2fd1965943467948c7c1175965b9971a2bd44044f146dca9426132ce1b25c66e"} Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.086661 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-klm8k" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.086947 4893 scope.go:117] "RemoveContainer" containerID="9fcd35de59276a79b1840facdabeb20b07e44c036d429f0836781619eb44e0b1" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.128757 4893 scope.go:117] "RemoveContainer" containerID="5377adfc2fccaa116e61b562ad604ad0d01cff2abe44fc2da96d2d4897c0057c" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.167967 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.174275 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-klm8k"] Mar 14 07:21:34 crc kubenswrapper[4893]: E0314 07:21:34.257808 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86774cd6_a036_4ec8_8e5a_d82247af2737.slice/crio-2fd1965943467948c7c1175965b9971a2bd44044f146dca9426132ce1b25c66e\": RecentStats: unable to find data in memory cache]" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.631573 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:34 crc kubenswrapper[4893]: I0314 07:21:34.766186 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.102589 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerStarted","Data":"8767cd3be5fb08b8866fd05260904400de5bafd5f4dffc6eeb4a5849a78f01d1"} Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.103118 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 
07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.133930 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.652088244 podStartE2EDuration="10.133907988s" podCreationTimestamp="2026-03-14 07:21:25 +0000 UTC" firstStartedPulling="2026-03-14 07:21:26.350107802 +0000 UTC m=+1365.612284594" lastFinishedPulling="2026-03-14 07:21:34.831927546 +0000 UTC m=+1374.094104338" observedRunningTime="2026-03-14 07:21:35.128194109 +0000 UTC m=+1374.390370921" watchObservedRunningTime="2026-03-14 07:21:35.133907988 +0000 UTC m=+1374.396084780" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.164255 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.391792 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" path="/var/lib/kubelet/pods/86774cd6-a036-4ec8-8e5a-d82247af2737/volumes" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.403149 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.403618 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bc7479fc9-jhvmx" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-api" containerID="cri-o://fb1d21418e4f5a24e0fc43ce4b7a9d513c6c2d457b8866d744b8f46d082bc8e5" gracePeriod=30 Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.405676 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bc7479fc9-jhvmx" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" containerID="cri-o://70e6e56d3864bf198a3fe7ff9fce7eed328301b7a8a47671c97cf9c60930d59c" gracePeriod=30 Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.422753 4893 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/neutron-6bc7479fc9-jhvmx" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": EOF" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.471712 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"] Mar 14 07:21:35 crc kubenswrapper[4893]: E0314 07:21:35.472146 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="init" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.472161 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="init" Mar 14 07:21:35 crc kubenswrapper[4893]: E0314 07:21:35.472177 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="dnsmasq-dns" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.472183 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="dnsmasq-dns" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.472360 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="86774cd6-a036-4ec8-8e5a-d82247af2737" containerName="dnsmasq-dns" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.474761 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.482895 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"] Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592467 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2stgm\" (UniqueName: \"kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592537 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592575 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592596 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592691 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592730 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.592757 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.696715 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.696853 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.696902 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.696990 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2stgm\" (UniqueName: \"kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.697013 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.697080 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.697103 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.711797 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config\") pod 
\"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.712490 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.713210 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.740331 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.742348 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.749221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2stgm\" (UniqueName: \"kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 
07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.749693 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs\") pod \"neutron-5cdc5f965f-t6wfv\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") " pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:35 crc kubenswrapper[4893]: I0314 07:21:35.849662 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:36 crc kubenswrapper[4893]: I0314 07:21:36.133455 4893 generic.go:334] "Generic (PLEG): container finished" podID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerID="70e6e56d3864bf198a3fe7ff9fce7eed328301b7a8a47671c97cf9c60930d59c" exitCode=0 Mar 14 07:21:36 crc kubenswrapper[4893]: I0314 07:21:36.133566 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerDied","Data":"70e6e56d3864bf198a3fe7ff9fce7eed328301b7a8a47671c97cf9c60930d59c"} Mar 14 07:21:36 crc kubenswrapper[4893]: I0314 07:21:36.607899 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"] Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.153653 4893 generic.go:334] "Generic (PLEG): container finished" podID="7e3cdf1f-7963-494d-86f8-699e7401fe91" containerID="3dab1821b74126bc10d3f1995b8bb609c8b02b71125d8eef1e9212847979e89d" exitCode=0 Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.154009 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pmw52" event={"ID":"7e3cdf1f-7963-494d-86f8-699e7401fe91","Type":"ContainerDied","Data":"3dab1821b74126bc10d3f1995b8bb609c8b02b71125d8eef1e9212847979e89d"} Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.163712 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" 
event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerStarted","Data":"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"} Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.163773 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerStarted","Data":"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"} Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.163784 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerStarted","Data":"b198a4813a42a64738b956734b9eed27d901c2fb3560a8a6d19529237f69b1ef"} Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.164024 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.216869 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cdc5f965f-t6wfv" podStartSLOduration=2.216850736 podStartE2EDuration="2.216850736s" podCreationTimestamp="2026-03-14 07:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:37.204375012 +0000 UTC m=+1376.466551804" watchObservedRunningTime="2026-03-14 07:21:37.216850736 +0000 UTC m=+1376.479027528" Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.863714 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bc7479fc9-jhvmx" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.881715 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:37 crc kubenswrapper[4893]: I0314 07:21:37.958176 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.025249 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.025459 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7547ccd746-fj68b" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api-log" containerID="cri-o://f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760" gracePeriod=30 Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.025827 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7547ccd746-fj68b" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api" containerID="cri-o://cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd" gracePeriod=30 Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.192565 4893 generic.go:334] "Generic (PLEG): container finished" podID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerID="f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760" exitCode=143 Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.192783 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerDied","Data":"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760"} Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.563640 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-pmw52" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596354 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtwm9\" (UniqueName: \"kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596432 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596455 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596484 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.596532 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id\") pod \"7e3cdf1f-7963-494d-86f8-699e7401fe91\" (UID: \"7e3cdf1f-7963-494d-86f8-699e7401fe91\") " Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.597071 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.604689 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts" (OuterVolumeSpecName: "scripts") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.605699 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9" (OuterVolumeSpecName: "kube-api-access-vtwm9") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "kube-api-access-vtwm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.605693 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.698656 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtwm9\" (UniqueName: \"kubernetes.io/projected/7e3cdf1f-7963-494d-86f8-699e7401fe91-kube-api-access-vtwm9\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.698697 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.698706 4893 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.698716 4893 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e3cdf1f-7963-494d-86f8-699e7401fe91-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.719349 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.795682 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data" (OuterVolumeSpecName: "config-data") pod "7e3cdf1f-7963-494d-86f8-699e7401fe91" (UID: "7e3cdf1f-7963-494d-86f8-699e7401fe91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.800679 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:38 crc kubenswrapper[4893]: I0314 07:21:38.800727 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3cdf1f-7963-494d-86f8-699e7401fe91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.202365 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pmw52" event={"ID":"7e3cdf1f-7963-494d-86f8-699e7401fe91","Type":"ContainerDied","Data":"7e5ba0e85aa01810f970671e1773e20c17d177c13ca8b33514f7d251ab9a0481"} Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.202403 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5ba0e85aa01810f970671e1773e20c17d177c13ca8b33514f7d251ab9a0481" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.202424 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-pmw52" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.548557 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:21:39 crc kubenswrapper[4893]: E0314 07:21:39.548912 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" containerName="cinder-db-sync" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.548928 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" containerName="cinder-db-sync" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.549114 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" containerName="cinder-db-sync" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.549963 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.559000 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.559030 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.559216 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.576897 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ngkl4" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.578987 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.580742 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.601477 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625433 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnl48\" (UniqueName: \"kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625492 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625530 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625565 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625602 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jtnmz\" (UniqueName: \"kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625633 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625688 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625741 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625764 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625807 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625839 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.625911 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.645569 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728427 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728476 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtnmz\" (UniqueName: \"kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728508 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728567 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728600 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728621 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728659 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728691 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728750 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728776 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnl48\" (UniqueName: \"kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728795 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.728811 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.729251 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.730108 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.730978 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.731312 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.731894 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.733067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 
07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.743293 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.744718 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.745492 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.750891 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.755898 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtnmz\" (UniqueName: \"kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz\") pod \"cinder-scheduler-0\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.769654 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnl48\" (UniqueName: 
\"kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48\") pod \"dnsmasq-dns-5547746bbf-pkbmn\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.771918 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.789674 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.795498 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.816385 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.829881 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.829926 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.829982 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc 
kubenswrapper[4893]: I0314 07:21:39.830005 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slm69\" (UniqueName: \"kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.830041 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.830087 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.830114 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.880346 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.914587 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931437 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931494 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931569 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931609 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931673 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931704 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slm69\" 
(UniqueName: \"kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.931751 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.933932 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.934000 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.943374 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.946410 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.948481 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.953099 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:39 crc kubenswrapper[4893]: I0314 07:21:39.958844 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slm69\" (UniqueName: \"kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69\") pod \"cinder-api-0\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " pod="openstack/cinder-api-0" Mar 14 07:21:40 crc kubenswrapper[4893]: I0314 07:21:40.177620 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:40 crc kubenswrapper[4893]: I0314 07:21:40.432026 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:21:40 crc kubenswrapper[4893]: W0314 07:21:40.432445 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6efc151f_afa4_4275_af88_738c5c23c651.slice/crio-d2a8a60accc816ad6f3d08f4de05c449df684739146aab2f6a80623cab884520 WatchSource:0}: Error finding container d2a8a60accc816ad6f3d08f4de05c449df684739146aab2f6a80623cab884520: Status 404 returned error can't find the container with id d2a8a60accc816ad6f3d08f4de05c449df684739146aab2f6a80623cab884520 Mar 14 07:21:40 crc kubenswrapper[4893]: I0314 07:21:40.510257 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:21:40 crc kubenswrapper[4893]: I0314 07:21:40.636881 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:40 crc kubenswrapper[4893]: W0314 07:21:40.642131 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa48b64_9274_45fe_b2a5_b2abb1a5d27d.slice/crio-ef79b7ca78822e52dc5eb59b85e265975ae60b57dd17bb68cda14410994960a4 WatchSource:0}: Error finding container ef79b7ca78822e52dc5eb59b85e265975ae60b57dd17bb68cda14410994960a4: Status 404 returned error can't find the container with id ef79b7ca78822e52dc5eb59b85e265975ae60b57dd17bb68cda14410994960a4 Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.227907 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerStarted","Data":"174262b632784fe37878a1af5d5c452460184ed035ebce7d28502f291ba19945"} Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.229607 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerStarted","Data":"ef79b7ca78822e52dc5eb59b85e265975ae60b57dd17bb68cda14410994960a4"} Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.232143 4893 generic.go:334] "Generic (PLEG): container finished" podID="6efc151f-afa4-4275-af88-738c5c23c651" containerID="91afdf74937623f4834bf2f3dec17b193ce76742c7b8d5c2664e65f31c0e9c85" exitCode=0 Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.232178 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" event={"ID":"6efc151f-afa4-4275-af88-738c5c23c651","Type":"ContainerDied","Data":"91afdf74937623f4834bf2f3dec17b193ce76742c7b8d5c2664e65f31c0e9c85"} Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.232196 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" event={"ID":"6efc151f-afa4-4275-af88-738c5c23c651","Type":"ContainerStarted","Data":"d2a8a60accc816ad6f3d08f4de05c449df684739146aab2f6a80623cab884520"} Mar 14 07:21:41 crc kubenswrapper[4893]: I0314 07:21:41.883456 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.083263 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle\") pod \"e4095938-fa00-41e6-ae21-21852f42f7bf\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.083331 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpl68\" (UniqueName: \"kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68\") pod \"e4095938-fa00-41e6-ae21-21852f42f7bf\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.083406 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data\") pod \"e4095938-fa00-41e6-ae21-21852f42f7bf\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.083451 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom\") pod \"e4095938-fa00-41e6-ae21-21852f42f7bf\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.083551 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs\") pod \"e4095938-fa00-41e6-ae21-21852f42f7bf\" (UID: \"e4095938-fa00-41e6-ae21-21852f42f7bf\") " Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.084193 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs" (OuterVolumeSpecName: "logs") pod "e4095938-fa00-41e6-ae21-21852f42f7bf" (UID: "e4095938-fa00-41e6-ae21-21852f42f7bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.090197 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68" (OuterVolumeSpecName: "kube-api-access-cpl68") pod "e4095938-fa00-41e6-ae21-21852f42f7bf" (UID: "e4095938-fa00-41e6-ae21-21852f42f7bf"). InnerVolumeSpecName "kube-api-access-cpl68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.094804 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4095938-fa00-41e6-ae21-21852f42f7bf" (UID: "e4095938-fa00-41e6-ae21-21852f42f7bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.129710 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4095938-fa00-41e6-ae21-21852f42f7bf" (UID: "e4095938-fa00-41e6-ae21-21852f42f7bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.188570 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4095938-fa00-41e6-ae21-21852f42f7bf-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.188597 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.188640 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpl68\" (UniqueName: \"kubernetes.io/projected/e4095938-fa00-41e6-ae21-21852f42f7bf-kube-api-access-cpl68\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.188650 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.253626 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data" (OuterVolumeSpecName: "config-data") pod "e4095938-fa00-41e6-ae21-21852f42f7bf" (UID: "e4095938-fa00-41e6-ae21-21852f42f7bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.291813 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4095938-fa00-41e6-ae21-21852f42f7bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.307315 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerStarted","Data":"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7"} Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.309058 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" event={"ID":"6efc151f-afa4-4275-af88-738c5c23c651","Type":"ContainerStarted","Data":"3372b430ebf12e190f0b37cb7022f5ea88eeb59494e0e98c6812b07b2b2587eb"} Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.310137 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.314887 4893 generic.go:334] "Generic (PLEG): container finished" podID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerID="cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd" exitCode=0 Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.314913 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerDied","Data":"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd"} Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.314928 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7547ccd746-fj68b" event={"ID":"e4095938-fa00-41e6-ae21-21852f42f7bf","Type":"ContainerDied","Data":"c7b651d861e98dcd5f1e28325f511c49663f6e9104fa4554c6ac5299f93493e2"} Mar 14 
07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.314943 4893 scope.go:117] "RemoveContainer" containerID="cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.315065 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7547ccd746-fj68b" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.369412 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" podStartSLOduration=3.369396152 podStartE2EDuration="3.369396152s" podCreationTimestamp="2026-03-14 07:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:42.328926996 +0000 UTC m=+1381.591103798" watchObservedRunningTime="2026-03-14 07:21:42.369396152 +0000 UTC m=+1381.631572944" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.373231 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.374021 4893 scope.go:117] "RemoveContainer" containerID="f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.382967 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7547ccd746-fj68b"] Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.439735 4893 scope.go:117] "RemoveContainer" containerID="cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd" Mar 14 07:21:42 crc kubenswrapper[4893]: E0314 07:21:42.440117 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd\": container with ID starting with cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd not found: ID does not 
exist" containerID="cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.440144 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd"} err="failed to get container status \"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd\": rpc error: code = NotFound desc = could not find container \"cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd\": container with ID starting with cfc02f2f9f6fa380bc3c871b1e0140ac2399773ce8bcdad066257299d2ee93cd not found: ID does not exist" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.440166 4893 scope.go:117] "RemoveContainer" containerID="f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760" Mar 14 07:21:42 crc kubenswrapper[4893]: E0314 07:21:42.440459 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760\": container with ID starting with f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760 not found: ID does not exist" containerID="f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.440504 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760"} err="failed to get container status \"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760\": rpc error: code = NotFound desc = could not find container \"f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760\": container with ID starting with f83bb5b7c58518fd63d12bcf6720a0ae3316f7b583e2cf48c242df5a8edbf760 not found: ID does not exist" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.915045 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:42 crc kubenswrapper[4893]: I0314 07:21:42.962398 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.331718 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerStarted","Data":"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec"} Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.332884 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.336178 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerStarted","Data":"d9717e50eff9a9adfbe8030e6984221cdd339d064f3f05f3234001ebcd7ecbec"} Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.336203 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerStarted","Data":"edc8085e74dce58973128f9b98a03e41cc178be710dcdbb75c0d004e490080fd"} Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.355044 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.355026786 podStartE2EDuration="4.355026786s" podCreationTimestamp="2026-03-14 07:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:43.353095489 +0000 UTC m=+1382.615272291" watchObservedRunningTime="2026-03-14 07:21:43.355026786 +0000 UTC m=+1382.617203578" Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.387016 4893 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" path="/var/lib/kubelet/pods/e4095938-fa00-41e6-ae21-21852f42f7bf/volumes" Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.388126 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.357041491 podStartE2EDuration="4.388107471s" podCreationTimestamp="2026-03-14 07:21:39 +0000 UTC" firstStartedPulling="2026-03-14 07:21:40.524622762 +0000 UTC m=+1379.786799554" lastFinishedPulling="2026-03-14 07:21:41.555688742 +0000 UTC m=+1380.817865534" observedRunningTime="2026-03-14 07:21:43.386945553 +0000 UTC m=+1382.649122345" watchObservedRunningTime="2026-03-14 07:21:43.388107471 +0000 UTC m=+1382.650284263" Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.423765 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:43 crc kubenswrapper[4893]: I0314 07:21:43.905154 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:21:44 crc kubenswrapper[4893]: I0314 07:21:44.881303 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.227844 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.337297 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.382131 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api-log" containerID="cri-o://099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" 
gracePeriod=30 Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.382218 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api" containerID="cri-o://55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" gracePeriod=30 Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.404228 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.404540 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-65c5f4f57d-t9vqz" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-log" containerID="cri-o://00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1" gracePeriod=30 Mar 14 07:21:45 crc kubenswrapper[4893]: I0314 07:21:45.405071 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-65c5f4f57d-t9vqz" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-api" containerID="cri-o://31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3" gracePeriod=30 Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.021699 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.022629 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api-log" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.022646 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api-log" Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.022670 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api" Mar 14 
07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.022677 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.022872 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api-log" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.022890 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4095938-fa00-41e6-ae21-21852f42f7bf" containerName="barbican-api" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.023667 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.025971 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.026227 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hkxs4" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.026777 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.039603 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.166501 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.166684 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.166754 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwt69\" (UniqueName: \"kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.166782 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.268825 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.268956 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.269006 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwt69\" (UniqueName: 
\"kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.269026 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.270141 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.276656 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.276771 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.299177 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwt69\" (UniqueName: \"kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69\") pod \"openstackclient\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " 
pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.304579 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.344284 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.387565 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.393235 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410493 4893 generic.go:334] "Generic (PLEG): container finished" podID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerID="55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" exitCode=0 Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410552 4893 generic.go:334] "Generic (PLEG): container finished" podID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerID="099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" exitCode=143 Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410682 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerDied","Data":"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec"} Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410722 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerDied","Data":"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7"} Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410743 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d","Type":"ContainerDied","Data":"ef79b7ca78822e52dc5eb59b85e265975ae60b57dd17bb68cda14410994960a4"} Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.410785 4893 scope.go:117] "RemoveContainer" containerID="55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.411040 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.428155 4893 generic.go:334] "Generic (PLEG): container finished" podID="596e7b0c-f030-4a30-870b-184071c1eab3" containerID="00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1" exitCode=143 Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.428221 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerDied","Data":"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1"} Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.457751 4893 scope.go:117] "RemoveContainer" containerID="099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.458235 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.458579 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.458591 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api" Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.458631 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api-log" Mar 14 
07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.458636 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api-log" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.458811 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.458824 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" containerName="cinder-api-log" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.459421 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.470227 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476530 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476596 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476733 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 
07:21:46.476802 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476908 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476943 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slm69\" (UniqueName: \"kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.476964 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle\") pod \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\" (UID: \"5fa48b64-9274-45fe-b2a5-b2abb1a5d27d\") " Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.479105 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.479359 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs" (OuterVolumeSpecName: "logs") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.480387 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.480417 4893 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.488668 4893 scope.go:117] "RemoveContainer" containerID="55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.492474 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69" (OuterVolumeSpecName: "kube-api-access-slm69") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "kube-api-access-slm69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.492609 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec\": container with ID starting with 55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec not found: ID does not exist" containerID="55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.492644 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec"} err="failed to get container status \"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec\": rpc error: code = NotFound desc = could not find container \"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec\": container with ID starting with 55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.492665 4893 scope.go:117] "RemoveContainer" containerID="099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.493492 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.498642 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts" (OuterVolumeSpecName: "scripts") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.498649 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7\": container with ID starting with 099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7 not found: ID does not exist" containerID="099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.498688 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7"} err="failed to get container status \"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7\": rpc error: code = NotFound desc = could not find container \"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7\": container with ID starting with 099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.498712 4893 scope.go:117] "RemoveContainer" containerID="55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.501634 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec"} err="failed to get container status 
\"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec\": rpc error: code = NotFound desc = could not find container \"55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec\": container with ID starting with 55d5c5dc2300c7d84e0536181a1475579b3d05fdf9e41e85bdd958be64f41fec not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.501662 4893 scope.go:117] "RemoveContainer" containerID="099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.502042 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7"} err="failed to get container status \"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7\": rpc error: code = NotFound desc = could not find container \"099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7\": container with ID starting with 099c2d87bc861b01fe0c281f8edc68d61b4422eac413d2c9d22014aa13ba2cb7 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.526947 4893 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 14 07:21:46 crc kubenswrapper[4893]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_d3b47ee3-fffc-498d-a833-e1c6331c2515_0(6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7" Netns:"/var/run/netns/697146c3-b3a6-4843-b816-d741f945a68d" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7;K8S_POD_UID=d3b47ee3-fffc-498d-a833-e1c6331c2515" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/d3b47ee3-fffc-498d-a833-e1c6331c2515]: expected pod UID "d3b47ee3-fffc-498d-a833-e1c6331c2515" but got "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" from Kube API Mar 14 07:21:46 crc kubenswrapper[4893]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 07:21:46 crc kubenswrapper[4893]: > Mar 14 07:21:46 crc kubenswrapper[4893]: E0314 07:21:46.527016 4893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 14 07:21:46 crc kubenswrapper[4893]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_d3b47ee3-fffc-498d-a833-e1c6331c2515_0(6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7" Netns:"/var/run/netns/697146c3-b3a6-4843-b816-d741f945a68d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6b111da02677629415a95ab821e6b5a91805aa1cc9a573debfa8c0f9de111fb7;K8S_POD_UID=d3b47ee3-fffc-498d-a833-e1c6331c2515" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/d3b47ee3-fffc-498d-a833-e1c6331c2515]: expected pod UID "d3b47ee3-fffc-498d-a833-e1c6331c2515" but got "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" from Kube API Mar 14 07:21:46 crc kubenswrapper[4893]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 07:21:46 crc kubenswrapper[4893]: > pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.554800 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.567412 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data" (OuterVolumeSpecName: "config-data") pod "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" (UID: "5fa48b64-9274-45fe-b2a5-b2abb1a5d27d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581447 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581498 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581601 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581748 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4fvm\" (UniqueName: \"kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581939 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581958 4893 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581970 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slm69\" (UniqueName: \"kubernetes.io/projected/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-kube-api-access-slm69\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581981 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.581991 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.682998 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.683101 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.683657 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config\") pod \"openstackclient\" (UID: 
\"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.683708 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4fvm\" (UniqueName: \"kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.684435 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.688213 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.688286 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.700206 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4fvm\" (UniqueName: \"kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm\") pod \"openstackclient\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.790237 4893 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.795955 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.807599 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.883203 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.887493 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.892063 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.893094 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.893165 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 07:21:46 crc kubenswrapper[4893]: I0314 07:21:46.901108 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001747 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001771 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001840 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001858 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.001877 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.002104 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 
07:21:47.002203 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgpx\" (UniqueName: \"kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.002244 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113507 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113566 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113592 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113663 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113681 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113700 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113741 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113777 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgpx\" (UniqueName: \"kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.113798 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc 
kubenswrapper[4893]: I0314 07:21:47.114587 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.115718 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.138108 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.138990 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.144078 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.144545 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.145343 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.150210 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgpx\" (UniqueName: \"kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.150304 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.226972 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.305396 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:21:47 crc kubenswrapper[4893]: W0314 07:21:47.316225 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a9bdb3e_d59b_4614_a1dd_bc6212ba104c.slice/crio-5c8a77e3f169607a6741b74385aaef162d7bbb11dd89b69138604e0191abbb54 WatchSource:0}: Error finding container 5c8a77e3f169607a6741b74385aaef162d7bbb11dd89b69138604e0191abbb54: Status 404 returned error can't find the container with id 5c8a77e3f169607a6741b74385aaef162d7bbb11dd89b69138604e0191abbb54 Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.398840 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa48b64-9274-45fe-b2a5-b2abb1a5d27d" path="/var/lib/kubelet/pods/5fa48b64-9274-45fe-b2a5-b2abb1a5d27d/volumes" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.446255 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c","Type":"ContainerStarted","Data":"5c8a77e3f169607a6741b74385aaef162d7bbb11dd89b69138604e0191abbb54"} Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.446276 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.454535 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.458156 4893 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d3b47ee3-fffc-498d-a833-e1c6331c2515" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.631470 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle\") pod \"d3b47ee3-fffc-498d-a833-e1c6331c2515\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.631578 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwt69\" (UniqueName: \"kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69\") pod \"d3b47ee3-fffc-498d-a833-e1c6331c2515\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.631614 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config\") pod \"d3b47ee3-fffc-498d-a833-e1c6331c2515\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.631656 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret\") pod \"d3b47ee3-fffc-498d-a833-e1c6331c2515\" (UID: \"d3b47ee3-fffc-498d-a833-e1c6331c2515\") " Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.633164 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d3b47ee3-fffc-498d-a833-e1c6331c2515" (UID: "d3b47ee3-fffc-498d-a833-e1c6331c2515"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.636438 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3b47ee3-fffc-498d-a833-e1c6331c2515" (UID: "d3b47ee3-fffc-498d-a833-e1c6331c2515"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.637293 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69" (OuterVolumeSpecName: "kube-api-access-dwt69") pod "d3b47ee3-fffc-498d-a833-e1c6331c2515" (UID: "d3b47ee3-fffc-498d-a833-e1c6331c2515"). InnerVolumeSpecName "kube-api-access-dwt69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.641596 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d3b47ee3-fffc-498d-a833-e1c6331c2515" (UID: "d3b47ee3-fffc-498d-a833-e1c6331c2515"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.722717 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.733663 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwt69\" (UniqueName: \"kubernetes.io/projected/d3b47ee3-fffc-498d-a833-e1c6331c2515-kube-api-access-dwt69\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.733709 4893 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.733722 4893 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[4893]: I0314 07:21:47.733734 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b47ee3-fffc-498d-a833-e1c6331c2515-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[4893]: I0314 07:21:48.464250 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 07:21:48 crc kubenswrapper[4893]: I0314 07:21:48.464414 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerStarted","Data":"492cd7441a800dc10a30bd9b6a28dd3d89e262a94b290571698a0218593d8d7f"} Mar 14 07:21:48 crc kubenswrapper[4893]: I0314 07:21:48.465295 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerStarted","Data":"2b3f223f3d3be067b7af41b7ac07c3c891459c19442c72cfd982aa0ff3500e8a"} Mar 14 07:21:48 crc kubenswrapper[4893]: I0314 07:21:48.480063 4893 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d3b47ee3-fffc-498d-a833-e1c6331c2515" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.349332 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.412865 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b47ee3-fffc-498d-a833-e1c6331c2515" path="/var/lib/kubelet/pods/d3b47ee3-fffc-498d-a833-e1c6331c2515/volumes" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.462858 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.462950 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.463036 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.463135 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjlh\" (UniqueName: \"kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.463167 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts\") pod 
\"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.463204 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.463257 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs\") pod \"596e7b0c-f030-4a30-870b-184071c1eab3\" (UID: \"596e7b0c-f030-4a30-870b-184071c1eab3\") " Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.474347 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs" (OuterVolumeSpecName: "logs") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.484572 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh" (OuterVolumeSpecName: "kube-api-access-8cjlh") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "kube-api-access-8cjlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.484804 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts" (OuterVolumeSpecName: "scripts") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.503536 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerStarted","Data":"471e6c6aa09a8b1a609a1618d206f8f19bac722872a90829f538c1ebf60510ec"} Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.503820 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.509486 4893 generic.go:334] "Generic (PLEG): container finished" podID="596e7b0c-f030-4a30-870b-184071c1eab3" containerID="31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3" exitCode=0 Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.509567 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerDied","Data":"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3"} Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.509617 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65c5f4f57d-t9vqz" event={"ID":"596e7b0c-f030-4a30-870b-184071c1eab3","Type":"ContainerDied","Data":"2873c24a7dd33d43255b2d00ad7f3a07b2907efd749752fffd5260424f1ad838"} Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.509634 4893 scope.go:117] "RemoveContainer" containerID="31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.509703 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65c5f4f57d-t9vqz" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.542133 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.542107846 podStartE2EDuration="3.542107846s" podCreationTimestamp="2026-03-14 07:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:49.535479295 +0000 UTC m=+1388.797656107" watchObservedRunningTime="2026-03-14 07:21:49.542107846 +0000 UTC m=+1388.804284638" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.574808 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data" (OuterVolumeSpecName: "config-data") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575222 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575597 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e7b0c-f030-4a30-870b-184071c1eab3-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575621 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575630 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575640 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cjlh\" (UniqueName: \"kubernetes.io/projected/596e7b0c-f030-4a30-870b-184071c1eab3-kube-api-access-8cjlh\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.575650 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.647676 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.666620 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "596e7b0c-f030-4a30-870b-184071c1eab3" (UID: "596e7b0c-f030-4a30-870b-184071c1eab3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.676921 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.676951 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596e7b0c-f030-4a30-870b-184071c1eab3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.733370 4893 scope.go:117] "RemoveContainer" containerID="00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.752268 4893 scope.go:117] "RemoveContainer" containerID="31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3" Mar 14 07:21:49 crc kubenswrapper[4893]: E0314 07:21:49.752847 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3\": container with ID starting with 31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3 not found: ID does not exist" containerID="31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.752909 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3"} err="failed to get container status \"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3\": rpc error: code = NotFound desc = could not find container \"31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3\": container with ID starting with 31d57d680adf2fd6fcd48d4af379a1f8d6a3daa8bd9cc2c05320d23011238bb3 not found: ID does not exist" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.752938 4893 scope.go:117] "RemoveContainer" containerID="00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1" Mar 14 07:21:49 crc kubenswrapper[4893]: E0314 07:21:49.753221 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1\": container with ID starting with 00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1 not found: ID does not exist" containerID="00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.753272 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1"} err="failed to get container status \"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1\": rpc error: code = NotFound desc = could not find container \"00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1\": container with ID starting with 00125448b2e07210458115d4d01b1b06cc59d956fbc3e01b78b30a9a5107b0f1 not found: ID does not exist" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.850577 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.859649 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-65c5f4f57d-t9vqz"] Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.915922 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.975554 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:49 crc kubenswrapper[4893]: I0314 07:21:49.975815 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="dnsmasq-dns" containerID="cri-o://61eaf3db7e8b3004545b51982c662d19fbcfdd2ec68b64a3868a7bad0bc49839" gracePeriod=10 Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.291723 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.366055 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.523281 4893 generic.go:334] "Generic (PLEG): container finished" podID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerID="61eaf3db7e8b3004545b51982c662d19fbcfdd2ec68b64a3868a7bad0bc49839" exitCode=0 Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.523336 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" event={"ID":"2fd95203-37e0-4d71-9fd6-a1b04882fe7b","Type":"ContainerDied","Data":"61eaf3db7e8b3004545b51982c662d19fbcfdd2ec68b64a3868a7bad0bc49839"} Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.525985 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="cinder-scheduler" containerID="cri-o://edc8085e74dce58973128f9b98a03e41cc178be710dcdbb75c0d004e490080fd" gracePeriod=30 
Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.526127 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="probe" containerID="cri-o://d9717e50eff9a9adfbe8030e6984221cdd339d064f3f05f3234001ebcd7ecbec" gracePeriod=30 Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.765116 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.905086 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.905190 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6x2\" (UniqueName: \"kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.905273 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.905304 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: 
I0314 07:21:50.905334 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.905363 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.912388 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2" (OuterVolumeSpecName: "kube-api-access-5j6x2") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "kube-api-access-5j6x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.977259 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.986060 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:50 crc kubenswrapper[4893]: I0314 07:21:50.997292 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config" (OuterVolumeSpecName: "config") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.010975 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.011408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") pod \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\" (UID: \"2fd95203-37e0-4d71-9fd6-a1b04882fe7b\") " Mar 14 07:21:51 crc kubenswrapper[4893]: W0314 07:21:51.011512 4893 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2fd95203-37e0-4d71-9fd6-a1b04882fe7b/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.011541 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.011982 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fd95203-37e0-4d71-9fd6-a1b04882fe7b" (UID: "2fd95203-37e0-4d71-9fd6-a1b04882fe7b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012289 4893 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012372 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6x2\" (UniqueName: \"kubernetes.io/projected/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-kube-api-access-5j6x2\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012454 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012539 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012606 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.012704 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2fd95203-37e0-4d71-9fd6-a1b04882fe7b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.390809 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" path="/var/lib/kubelet/pods/596e7b0c-f030-4a30-870b-184071c1eab3/volumes" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.537953 4893 generic.go:334] "Generic (PLEG): container finished" podID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerID="d9717e50eff9a9adfbe8030e6984221cdd339d064f3f05f3234001ebcd7ecbec" exitCode=0 Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.538301 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerDied","Data":"d9717e50eff9a9adfbe8030e6984221cdd339d064f3f05f3234001ebcd7ecbec"} Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.541464 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" event={"ID":"2fd95203-37e0-4d71-9fd6-a1b04882fe7b","Type":"ContainerDied","Data":"2266f7a2ed2d44b471c0c433e13eae4905c8605ad53fb429d70d51ede88c6eb5"} Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.541500 4893 scope.go:117] "RemoveContainer" containerID="61eaf3db7e8b3004545b51982c662d19fbcfdd2ec68b64a3868a7bad0bc49839" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.541615 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-qmxrz" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.564105 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.568753 4893 scope.go:117] "RemoveContainer" containerID="a1a1fb73e4b20dc9ca888200e4e9e9dc0978fc9b586d57aee895efacec634de8" Mar 14 07:21:51 crc kubenswrapper[4893]: I0314 07:21:51.571297 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-qmxrz"] Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.394053 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" path="/var/lib/kubelet/pods/2fd95203-37e0-4d71-9fd6-a1b04882fe7b/volumes" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.576241 4893 generic.go:334] "Generic (PLEG): container finished" podID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerID="fb1d21418e4f5a24e0fc43ce4b7a9d513c6c2d457b8866d744b8f46d082bc8e5" exitCode=0 Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.576277 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerDied","Data":"fb1d21418e4f5a24e0fc43ce4b7a9d513c6c2d457b8866d744b8f46d082bc8e5"} Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.852957 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:21:53 crc kubenswrapper[4893]: E0314 07:21:53.853590 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="dnsmasq-dns" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.853658 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="dnsmasq-dns" Mar 14 07:21:53 crc kubenswrapper[4893]: E0314 
07:21:53.853732 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-log" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.853813 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-log" Mar 14 07:21:53 crc kubenswrapper[4893]: E0314 07:21:53.853890 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-api" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.853940 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-api" Mar 14 07:21:53 crc kubenswrapper[4893]: E0314 07:21:53.854003 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="init" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.854066 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="init" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.854284 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd95203-37e0-4d71-9fd6-a1b04882fe7b" containerName="dnsmasq-dns" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.854356 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-log" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.854425 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="596e7b0c-f030-4a30-870b-184071c1eab3" containerName="placement-api" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.855333 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.857529 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.857574 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.859176 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.865708 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970059 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970141 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970192 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 
07:21:53.970207 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970224 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970411 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95l7h\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970536 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.970575 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 
07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.981580 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.981850 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-central-agent" containerID="cri-o://314663c94fd1429e5895114a853d20946087751d2d9ddaebb38e2f68bc8422e1" gracePeriod=30 Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.982031 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-notification-agent" containerID="cri-o://9f8707403ee49e1a1d27cf771e6fbaa144ef8a07c7f4c6fa2d32eb4eca8fdb02" gracePeriod=30 Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.982030 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" containerID="cri-o://8767cd3be5fb08b8866fd05260904400de5bafd5f4dffc6eeb4a5849a78f01d1" gracePeriod=30 Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.982874 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="sg-core" containerID="cri-o://e0308c113eb3e2da6d8b5ec320e11fbaa78a704e31414c99fea30584046274f9" gracePeriod=30 Mar 14 07:21:53 crc kubenswrapper[4893]: I0314 07:21:53.990414 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": EOF" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072354 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072420 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072470 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072498 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072513 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072546 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072598 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95l7h\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.072633 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.081141 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.081891 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.082954 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.084475 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.084602 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.089740 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.091399 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs\") pod \"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.102241 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95l7h\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h\") pod 
\"swift-proxy-7776dc7c77-9lm7q\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.173562 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591164 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerID="8767cd3be5fb08b8866fd05260904400de5bafd5f4dffc6eeb4a5849a78f01d1" exitCode=0 Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591201 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerID="e0308c113eb3e2da6d8b5ec320e11fbaa78a704e31414c99fea30584046274f9" exitCode=2 Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591213 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerID="9f8707403ee49e1a1d27cf771e6fbaa144ef8a07c7f4c6fa2d32eb4eca8fdb02" exitCode=0 Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591222 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerID="314663c94fd1429e5895114a853d20946087751d2d9ddaebb38e2f68bc8422e1" exitCode=0 Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591271 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerDied","Data":"8767cd3be5fb08b8866fd05260904400de5bafd5f4dffc6eeb4a5849a78f01d1"} Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591303 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerDied","Data":"e0308c113eb3e2da6d8b5ec320e11fbaa78a704e31414c99fea30584046274f9"} Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591317 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerDied","Data":"9f8707403ee49e1a1d27cf771e6fbaa144ef8a07c7f4c6fa2d32eb4eca8fdb02"} Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.591329 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerDied","Data":"314663c94fd1429e5895114a853d20946087751d2d9ddaebb38e2f68bc8422e1"} Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.594431 4893 generic.go:334] "Generic (PLEG): container finished" podID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerID="edc8085e74dce58973128f9b98a03e41cc178be710dcdbb75c0d004e490080fd" exitCode=0 Mar 14 07:21:54 crc kubenswrapper[4893]: I0314 07:21:54.594451 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerDied","Data":"edc8085e74dce58973128f9b98a03e41cc178be710dcdbb75c0d004e490080fd"} Mar 14 07:21:55 crc kubenswrapper[4893]: I0314 07:21:55.390260 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": dial tcp 10.217.0.168:3000: connect: connection refused" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.207466 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.731171 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 
07:21:59.731847 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.782176 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810276 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810349 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810485 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810686 
4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtnmz\" (UniqueName: \"kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810734 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data\") pod \"29521c0b-4522-499b-ba9a-f2248f7a0a07\" (UID: \"29521c0b-4522-499b-ba9a-f2248f7a0a07\") " Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.810947 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.811378 4893 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29521c0b-4522-499b-ba9a-f2248f7a0a07-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.819358 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz" (OuterVolumeSpecName: "kube-api-access-jtnmz") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "kube-api-access-jtnmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.821299 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts" (OuterVolumeSpecName: "scripts") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.822002 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.875864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.913635 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtnmz\" (UniqueName: \"kubernetes.io/projected/29521c0b-4522-499b-ba9a-f2248f7a0a07-kube-api-access-jtnmz\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.913902 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.914034 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.914139 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.924383 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data" (OuterVolumeSpecName: "config-data") pod "29521c0b-4522-499b-ba9a-f2248f7a0a07" (UID: "29521c0b-4522-499b-ba9a-f2248f7a0a07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.973932 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.974180 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-log" containerID="cri-o://7c0fa865cdda16c5e7099733a49b13b73a32bab1ba4edabe119ddc0ae6452bb4" gracePeriod=30 Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.974289 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-httpd" containerID="cri-o://7789f926434baf6d22b935982b2af6e952f4a729d8d70fb74ec2c13da50d7032" gracePeriod=30 Mar 14 07:21:59 crc kubenswrapper[4893]: I0314 07:21:59.993701 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.016792 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nwb4\" (UniqueName: \"kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.016873 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.016947 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.016980 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.017007 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.017038 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.017092 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd\") pod \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\" (UID: \"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.017467 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29521c0b-4522-499b-ba9a-f2248f7a0a07-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.017952 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.031390 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.031486 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts" (OuterVolumeSpecName: "scripts") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.031988 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4" (OuterVolumeSpecName: "kube-api-access-7nwb4") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "kube-api-access-7nwb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.103915 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.133047 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nwb4\" (UniqueName: \"kubernetes.io/projected/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-kube-api-access-7nwb4\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.133442 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.133471 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.133487 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.133500 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.176148 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557882-7m6mq"] Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177362 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-central-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177392 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-central-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177421 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-notification-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177430 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-notification-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177452 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="sg-core" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177462 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="sg-core" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177477 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="cinder-scheduler" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177485 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="cinder-scheduler" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177548 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177561 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.177596 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="probe" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.177606 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="probe" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178041 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-notification-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178072 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="cinder-scheduler" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178108 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="proxy-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178129 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="sg-core" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178155 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" containerName="ceilometer-central-agent" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.178175 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" containerName="probe" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.179221 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.183585 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.183647 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.187451 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.218370 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.229222 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-7m6mq"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.235683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5mm5\" (UniqueName: \"kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5\") pod \"auto-csr-approver-29557882-7m6mq\" (UID: \"eba46be1-6a5c-4665-aebf-6b243dec4ed7\") " pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.235915 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.244731 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data" (OuterVolumeSpecName: "config-data") pod "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" (UID: "4f3014ed-22ab-43fd-bb9b-44d9a4be10b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.284120 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:22:00 crc kubenswrapper[4893]: W0314 07:22:00.310217 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod203abd37_654f_480c_8a9d_719d767aec4d.slice/crio-fcc09e4de4f9872e828eb1c3dc4431a94be74768e7c23a24e47a9ed68a0404cc WatchSource:0}: Error finding container fcc09e4de4f9872e828eb1c3dc4431a94be74768e7c23a24e47a9ed68a0404cc: Status 404 returned error can't find the container with id fcc09e4de4f9872e828eb1c3dc4431a94be74768e7c23a24e47a9ed68a0404cc Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.339730 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5mm5\" (UniqueName: \"kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5\") pod \"auto-csr-approver-29557882-7m6mq\" (UID: \"eba46be1-6a5c-4665-aebf-6b243dec4ed7\") " pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.340420 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.360428 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5mm5\" (UniqueName: \"kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5\") pod \"auto-csr-approver-29557882-7m6mq\" (UID: \"eba46be1-6a5c-4665-aebf-6b243dec4ed7\") " pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.488149 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.491118 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.542860 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.542908 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.542933 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.542965 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.543035 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: 
\"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.543056 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.543123 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbsrc\" (UniqueName: \"kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc\") pod \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\" (UID: \"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20\") " Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.565189 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.581216 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc" (OuterVolumeSpecName: "kube-api-access-vbsrc") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "kube-api-access-vbsrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.641068 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.647747 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.649218 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbsrc\" (UniqueName: \"kubernetes.io/projected/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-kube-api-access-vbsrc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.649298 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.649365 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.649425 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-public-tls-certs\") 
on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.669844 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config" (OuterVolumeSpecName: "config") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.675116 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f3014ed-22ab-43fd-bb9b-44d9a4be10b5","Type":"ContainerDied","Data":"221f8705e00dbf2446410ebd804b21ced1ff120cd2015f6c49eda2311edf140f"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.675168 4893 scope.go:117] "RemoveContainer" containerID="8767cd3be5fb08b8866fd05260904400de5bafd5f4dffc6eeb4a5849a78f01d1" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.675320 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.678090 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.684486 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29521c0b-4522-499b-ba9a-f2248f7a0a07","Type":"ContainerDied","Data":"174262b632784fe37878a1af5d5c452460184ed035ebce7d28502f291ba19945"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.684667 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.699661 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" (UID: "fe041ac0-b6d6-4ce7-8d87-c91e695bcf20"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.711734 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c","Type":"ContainerStarted","Data":"743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.718071 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc7479fc9-jhvmx" event={"ID":"fe041ac0-b6d6-4ce7-8d87-c91e695bcf20","Type":"ContainerDied","Data":"abdce33177212cde9de28ef362cd6d21bd60d8807b089f1be2aef8c4f7eebcfa"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.718231 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bc7479fc9-jhvmx" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.736144 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.736210 4893 scope.go:117] "RemoveContainer" containerID="e0308c113eb3e2da6d8b5ec320e11fbaa78a704e31414c99fea30584046274f9" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.741753 4893 generic.go:334] "Generic (PLEG): container finished" podID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerID="7c0fa865cdda16c5e7099733a49b13b73a32bab1ba4edabe119ddc0ae6452bb4" exitCode=143 Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.742011 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerDied","Data":"7c0fa865cdda16c5e7099733a49b13b73a32bab1ba4edabe119ddc0ae6452bb4"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.750247 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.530700144 podStartE2EDuration="14.750225592s" podCreationTimestamp="2026-03-14 07:21:46 +0000 UTC" firstStartedPulling="2026-03-14 07:21:47.318450653 +0000 UTC m=+1386.580627455" lastFinishedPulling="2026-03-14 07:21:59.537976121 +0000 UTC m=+1398.800152903" observedRunningTime="2026-03-14 07:22:00.749145906 +0000 UTC m=+1400.011322698" watchObservedRunningTime="2026-03-14 07:22:00.750225592 +0000 UTC m=+1400.012402384" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.750983 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.751024 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" 
event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerStarted","Data":"fcc09e4de4f9872e828eb1c3dc4431a94be74768e7c23a24e47a9ed68a0404cc"} Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.752109 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.752130 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.752140 4893 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.777696 4893 scope.go:117] "RemoveContainer" containerID="9f8707403ee49e1a1d27cf771e6fbaa144ef8a07c7f4c6fa2d32eb4eca8fdb02" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.781716 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.782162 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-api" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.782176 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-api" Mar 14 07:22:00 crc kubenswrapper[4893]: E0314 07:22:00.782194 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.782200 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.782398 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-httpd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.782409 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" containerName="neutron-api" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.792511 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.796506 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.801854 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.804571 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.828907 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.843054 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.848822 4893 scope.go:117] "RemoveContainer" containerID="314663c94fd1429e5895114a853d20946087751d2d9ddaebb38e2f68bc8422e1" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.853971 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgnb\" (UniqueName: \"kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" 
Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854013 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854052 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854078 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854126 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854157 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.854199 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.860643 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.862124 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.866087 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.877408 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.881367 4893 scope.go:117] "RemoveContainer" containerID="d9717e50eff9a9adfbe8030e6984221cdd339d064f3f05f3234001ebcd7ecbec" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.904589 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.911240 4893 scope.go:117] "RemoveContainer" containerID="edc8085e74dce58973128f9b98a03e41cc178be710dcdbb75c0d004e490080fd" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.911849 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bc7479fc9-jhvmx"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.935655 4893 scope.go:117] "RemoveContainer" containerID="70e6e56d3864bf198a3fe7ff9fce7eed328301b7a8a47671c97cf9c60930d59c" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955622 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955708 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955747 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955782 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955806 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955873 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgnb\" (UniqueName: \"kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955892 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955917 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dvb\" (UniqueName: \"kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955939 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955956 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955973 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.955987 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.956005 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.957088 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.958595 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.961972 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.962538 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.962920 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.964114 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.983974 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgnb\" (UniqueName: \"kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb\") pod \"ceilometer-0\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " pod="openstack/ceilometer-0" Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.992240 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lpk6v"] Mar 14 07:22:00 crc kubenswrapper[4893]: I0314 07:22:00.993413 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.006757 4893 scope.go:117] "RemoveContainer" containerID="fb1d21418e4f5a24e0fc43ce4b7a9d513c6c2d457b8866d744b8f46d082bc8e5" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.027046 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lpk6v"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.045562 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bbgd4"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.047274 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.053822 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bbgd4"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.058217 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.058263 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.058294 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.058314 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.058377 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts\") pod \"nova-api-db-create-lpk6v\" 
(UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.061715 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rws76\" (UniqueName: \"kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76\") pod \"nova-api-db-create-lpk6v\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.061774 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.061940 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dvb\" (UniqueName: \"kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.065134 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.065411 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: 
I0314 07:22:01.065550 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.067290 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-7m6mq"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.067886 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.068240 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.082235 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5dvb\" (UniqueName: \"kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb\") pod \"cinder-scheduler-0\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.082836 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p7mnh"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.084221 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.102459 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7mnh"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.120589 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163502 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9hd\" (UniqueName: \"kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd\") pod \"nova-cell1-db-create-p7mnh\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163572 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts\") pod \"nova-cell1-db-create-p7mnh\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163635 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts\") pod \"nova-cell0-db-create-bbgd4\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163660 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5frv\" (UniqueName: \"kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv\") pod \"nova-cell0-db-create-bbgd4\" (UID: 
\"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163704 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts\") pod \"nova-api-db-create-lpk6v\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.163907 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rws76\" (UniqueName: \"kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76\") pod \"nova-api-db-create-lpk6v\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.164649 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts\") pod \"nova-api-db-create-lpk6v\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.180321 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rws76\" (UniqueName: \"kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76\") pod \"nova-api-db-create-lpk6v\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.183535 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.246569 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f82a-account-create-update-99zsr"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.248619 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.254903 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.259438 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f82a-account-create-update-99zsr"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.283669 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.283898 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fww67\" (UniqueName: \"kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.284119 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9hd\" (UniqueName: \"kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd\") pod \"nova-cell1-db-create-p7mnh\" (UID: 
\"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.284165 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts\") pod \"nova-cell1-db-create-p7mnh\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.284261 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts\") pod \"nova-cell0-db-create-bbgd4\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.284279 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5frv\" (UniqueName: \"kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv\") pod \"nova-cell0-db-create-bbgd4\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.286199 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts\") pod \"nova-cell0-db-create-bbgd4\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.288335 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts\") pod \"nova-cell1-db-create-p7mnh\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " 
pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.314432 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9hd\" (UniqueName: \"kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd\") pod \"nova-cell1-db-create-p7mnh\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.329120 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5frv\" (UniqueName: \"kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv\") pod \"nova-cell0-db-create-bbgd4\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.335740 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.367168 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.387731 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.401674 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.401972 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fww67\" (UniqueName: \"kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.443185 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.456040 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29521c0b-4522-499b-ba9a-f2248f7a0a07" path="/var/lib/kubelet/pods/29521c0b-4522-499b-ba9a-f2248f7a0a07/volumes" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.458426 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3014ed-22ab-43fd-bb9b-44d9a4be10b5" path="/var/lib/kubelet/pods/4f3014ed-22ab-43fd-bb9b-44d9a4be10b5/volumes" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.464497 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe041ac0-b6d6-4ce7-8d87-c91e695bcf20" path="/var/lib/kubelet/pods/fe041ac0-b6d6-4ce7-8d87-c91e695bcf20/volumes" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.469977 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-259c-account-create-update-g6bdg"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.471555 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.472789 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fww67\" (UniqueName: \"kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67\") pod \"nova-api-f82a-account-create-update-99zsr\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.476769 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.479286 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-g6bdg"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.590184 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.610631 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.610707 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bxz\" (UniqueName: \"kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 
07:22:01.668212 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-tbdb9"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.669768 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.680783 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-tbdb9"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.683734 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.712622 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.712699 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggzn7\" (UniqueName: \"kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.712721 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bxz\" (UniqueName: \"kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc 
kubenswrapper[4893]: I0314 07:22:01.712791 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.714009 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.744154 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bxz\" (UniqueName: \"kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz\") pod \"nova-cell0-259c-account-create-update-g6bdg\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.777776 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.807154 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.813754 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" event={"ID":"eba46be1-6a5c-4665-aebf-6b243dec4ed7","Type":"ContainerStarted","Data":"39a8cf740031ef08b89ca65bc20f98050d01114901e5535d34932812c3393b07"} Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.815105 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ggzn7\" (UniqueName: \"kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.815206 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.816060 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.822880 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerStarted","Data":"52854fc52e28bf94ad21c38ff77654cb2e7662a16eb28502030a3a7f5acaab4f"} Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.822947 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerStarted","Data":"26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625"} Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.823617 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:22:01 
crc kubenswrapper[4893]: I0314 07:22:01.823653 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.840839 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggzn7\" (UniqueName: \"kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7\") pod \"nova-cell1-33a2-account-create-update-tbdb9\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.863013 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7776dc7c77-9lm7q" podStartSLOduration=8.862991131 podStartE2EDuration="8.862991131s" podCreationTimestamp="2026-03-14 07:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:01.851633365 +0000 UTC m=+1401.113810247" watchObservedRunningTime="2026-03-14 07:22:01.862991131 +0000 UTC m=+1401.125167923" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.976070 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:01 crc kubenswrapper[4893]: I0314 07:22:01.998254 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.043293 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lpk6v"] Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.210218 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7mnh"] Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.224859 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bbgd4"] Mar 14 07:22:02 crc kubenswrapper[4893]: W0314 07:22:02.227558 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481d0c9c_3ac9_4bae_bd8c_52489deda58c.slice/crio-1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8 WatchSource:0}: Error finding container 1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8: Status 404 returned error can't find the container with id 1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8 Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.389883 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f82a-account-create-update-99zsr"] Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.614897 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-g6bdg"] Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.707831 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-tbdb9"] Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.867754 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerStarted","Data":"c58a47973eaecdc8e02227a8b12a8df31f92d1ce64dcf9511756574e58a81857"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 
07:22:02.882769 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lpk6v" event={"ID":"f50f3b01-c781-4d53-8c0c-62cb289ebbde","Type":"ContainerStarted","Data":"0ff8d81af854d517f8b4724b88564193aeee08ef8708f6f38437ce2e9ccf15a3"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.882968 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lpk6v" event={"ID":"f50f3b01-c781-4d53-8c0c-62cb289ebbde","Type":"ContainerStarted","Data":"05347266b0b7a2cf9b6a0416cfe3f28596eb1f2d0c5e113b6bfd558ed7a206f4"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.895896 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f82a-account-create-update-99zsr" event={"ID":"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1","Type":"ContainerStarted","Data":"e091e6359069072e95d849579519dc6127d715d0682d72f22a1ab3894b0216e7"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.905316 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-lpk6v" podStartSLOduration=2.905303866 podStartE2EDuration="2.905303866s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:02.896364298 +0000 UTC m=+1402.158541090" watchObservedRunningTime="2026-03-14 07:22:02.905303866 +0000 UTC m=+1402.167480658" Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.911714 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerStarted","Data":"3b1b5b753d3d3c389a58958b6db29604cea9261ff16da90a1c0179249689611d"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.919633 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bbgd4" 
event={"ID":"e685f75c-8311-4120-88ca-be0e6060d132","Type":"ContainerStarted","Data":"4145b69299be2b3f60904fff51b0470316df9d6499550625f6c1a91b3089e847"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.937162 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7mnh" event={"ID":"481d0c9c-3ac9-4bae-bd8c-52489deda58c","Type":"ContainerStarted","Data":"19931e7934a437538c7634c56ea710f8724a76205f9ecedc8b6d7e42a3abc4dd"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.937198 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7mnh" event={"ID":"481d0c9c-3ac9-4bae-bd8c-52489deda58c","Type":"ContainerStarted","Data":"1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8"} Mar 14 07:22:02 crc kubenswrapper[4893]: I0314 07:22:02.965857 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-p7mnh" podStartSLOduration=2.96583797 podStartE2EDuration="2.96583797s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:02.960893369 +0000 UTC m=+1402.223070151" watchObservedRunningTime="2026-03-14 07:22:02.96583797 +0000 UTC m=+1402.228014762" Mar 14 07:22:03 crc kubenswrapper[4893]: W0314 07:22:03.017590 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf09ba521_7c5d_4b30_902d_d7634a77e369.slice/crio-fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec WatchSource:0}: Error finding container fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec: Status 404 returned error can't find the container with id fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.154666 4893 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:52066->10.217.0.156:9292: read: connection reset by peer" Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.154666 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:52078->10.217.0.156:9292: read: connection reset by peer" Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.952254 4893 generic.go:334] "Generic (PLEG): container finished" podID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerID="7789f926434baf6d22b935982b2af6e952f4a729d8d70fb74ec2c13da50d7032" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.952383 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerDied","Data":"7789f926434baf6d22b935982b2af6e952f4a729d8d70fb74ec2c13da50d7032"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.954837 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581","Type":"ContainerDied","Data":"b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.954852 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e5bbe5677b8d1412b58ad0cec33d0fed3151ffc4aa57e601365ce0294a2b78" Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.957205 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerStarted","Data":"71b61bbcb36ee4b29d89e6fb6180dc6c4d178973e2003b6cff8a03ab59ae3be1"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.964543 4893 generic.go:334] "Generic (PLEG): container finished" podID="f50f3b01-c781-4d53-8c0c-62cb289ebbde" containerID="0ff8d81af854d517f8b4724b88564193aeee08ef8708f6f38437ce2e9ccf15a3" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.964614 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lpk6v" event={"ID":"f50f3b01-c781-4d53-8c0c-62cb289ebbde","Type":"ContainerDied","Data":"0ff8d81af854d517f8b4724b88564193aeee08ef8708f6f38437ce2e9ccf15a3"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.971152 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerStarted","Data":"200f0500dd95736bf725b120a1467c014a0ae3d6596c940427a6b0fa69a9ddd9"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.980359 4893 generic.go:334] "Generic (PLEG): container finished" podID="e685f75c-8311-4120-88ca-be0e6060d132" containerID="78c6cc2dea39de360a990b14c55febaf458a97e1d80a447d463589cdbf99d88b" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.980955 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bbgd4" event={"ID":"e685f75c-8311-4120-88ca-be0e6060d132","Type":"ContainerDied","Data":"78c6cc2dea39de360a990b14c55febaf458a97e1d80a447d463589cdbf99d88b"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.983190 4893 generic.go:334] "Generic (PLEG): container finished" podID="481d0c9c-3ac9-4bae-bd8c-52489deda58c" containerID="19931e7934a437538c7634c56ea710f8724a76205f9ecedc8b6d7e42a3abc4dd" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.983283 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-p7mnh" event={"ID":"481d0c9c-3ac9-4bae-bd8c-52489deda58c","Type":"ContainerDied","Data":"19931e7934a437538c7634c56ea710f8724a76205f9ecedc8b6d7e42a3abc4dd"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.985171 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" event={"ID":"f09ba521-7c5d-4b30-902d-d7634a77e369","Type":"ContainerStarted","Data":"838c803bb68c10a1197d0b8154267c2281ac4430fa3451398720d14bddbc3c4a"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.985198 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" event={"ID":"f09ba521-7c5d-4b30-902d-d7634a77e369","Type":"ContainerStarted","Data":"fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.991450 4893 generic.go:334] "Generic (PLEG): container finished" podID="eba46be1-6a5c-4665-aebf-6b243dec4ed7" containerID="1dda442ed9ea7342ce7b0e9960fc52e71ada89b41249ed7867a630b253b5adb5" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.991533 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" event={"ID":"eba46be1-6a5c-4665-aebf-6b243dec4ed7","Type":"ContainerDied","Data":"1dda442ed9ea7342ce7b0e9960fc52e71ada89b41249ed7867a630b253b5adb5"} Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.999335 4893 generic.go:334] "Generic (PLEG): container finished" podID="435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" containerID="9ed2f5c118d7fd5926e474d0b57158062f4bbe6f5683be858f729b9d25ff8388" exitCode=0 Mar 14 07:22:03 crc kubenswrapper[4893]: I0314 07:22:03.999500 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f82a-account-create-update-99zsr" 
event={"ID":"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1","Type":"ContainerDied","Data":"9ed2f5c118d7fd5926e474d0b57158062f4bbe6f5683be858f729b9d25ff8388"} Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.002922 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" event={"ID":"b113c1f3-1fdd-4fd0-806e-42707f79ba1e","Type":"ContainerStarted","Data":"de8f12a68ff328509ec01f4da9cfd1fdd44d102507da5c518743305ac27423ab"} Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.002971 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" event={"ID":"b113c1f3-1fdd-4fd0-806e-42707f79ba1e","Type":"ContainerStarted","Data":"a3b3ac1b545d66d895167b2bc981dfa3135424dae9d7961de958b5d8b38251d5"} Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.035834 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.068280 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" podStartSLOduration=3.068257877 podStartE2EDuration="3.068257877s" podCreationTimestamp="2026-03-14 07:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:04.04165901 +0000 UTC m=+1403.303835812" watchObservedRunningTime="2026-03-14 07:22:04.068257877 +0000 UTC m=+1403.330434669" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.075871 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.075921 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.075994 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.076038 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.076126 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.076236 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpvsn\" (UniqueName: \"kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.076309 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: 
\"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.076337 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run\") pod \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\" (UID: \"7acb9d97-b4c7-49b4-90f0-e1d5d97d2581\") " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.082617 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" podStartSLOduration=3.082591336 podStartE2EDuration="3.082591336s" podCreationTimestamp="2026-03-14 07:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:04.059838502 +0000 UTC m=+1403.322015294" watchObservedRunningTime="2026-03-14 07:22:04.082591336 +0000 UTC m=+1403.344768148" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.085183 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.085496 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs" (OuterVolumeSpecName: "logs") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.090766 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn" (OuterVolumeSpecName: "kube-api-access-dpvsn") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "kube-api-access-dpvsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.092785 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts" (OuterVolumeSpecName: "scripts") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.095928 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.155872 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data" (OuterVolumeSpecName: "config-data") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.181757 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.181956 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpvsn\" (UniqueName: \"kubernetes.io/projected/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-kube-api-access-dpvsn\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.182017 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.182081 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.182136 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.182208 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.206857 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.213580 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.238665 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" (UID: "7acb9d97-b4c7-49b4-90f0-e1d5d97d2581"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.283679 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.283711 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:04 crc kubenswrapper[4893]: I0314 07:22:04.283724 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.013266 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerStarted","Data":"b486620745036714e8a74e5dac973835df47c1d93972a1fd28507e74239e6c54"} Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.017061 4893 generic.go:334] "Generic (PLEG): container finished" podID="f09ba521-7c5d-4b30-902d-d7634a77e369" 
containerID="838c803bb68c10a1197d0b8154267c2281ac4430fa3451398720d14bddbc3c4a" exitCode=0 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.017128 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" event={"ID":"f09ba521-7c5d-4b30-902d-d7634a77e369","Type":"ContainerDied","Data":"838c803bb68c10a1197d0b8154267c2281ac4430fa3451398720d14bddbc3c4a"} Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.018875 4893 generic.go:334] "Generic (PLEG): container finished" podID="b113c1f3-1fdd-4fd0-806e-42707f79ba1e" containerID="de8f12a68ff328509ec01f4da9cfd1fdd44d102507da5c518743305ac27423ab" exitCode=0 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.018943 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" event={"ID":"b113c1f3-1fdd-4fd0-806e-42707f79ba1e","Type":"ContainerDied","Data":"de8f12a68ff328509ec01f4da9cfd1fdd44d102507da5c518743305ac27423ab"} Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.020825 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerStarted","Data":"6d4a0c5d5f1c982cd165c28bd58f8c768f1c7c5b718e400d69233659dc76ae51"} Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.020904 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.070271 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.07025229 podStartE2EDuration="5.07025229s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:05.063054655 +0000 UTC m=+1404.325231467" watchObservedRunningTime="2026-03-14 07:22:05.07025229 +0000 UTC m=+1404.332429082" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.137361 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.188490 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.209579 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:05 crc kubenswrapper[4893]: E0314 07:22:05.210804 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-httpd" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.210882 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-httpd" Mar 14 07:22:05 crc kubenswrapper[4893]: E0314 07:22:05.210939 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-log" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.210999 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-log" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.211231 4893 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-httpd" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.211295 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" containerName="glance-log" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.212187 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.212330 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.221044 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.221644 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.322036 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.322662 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.322751 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.322856 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.322936 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.323007 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.323095 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.323178 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhtt\" 
(UniqueName: \"kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.415367 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7acb9d97-b4c7-49b4-90f0-e1d5d97d2581" path="/var/lib/kubelet/pods/7acb9d97-b4c7-49b4-90f0-e1d5d97d2581/volumes" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427490 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427540 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427579 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427604 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc 
kubenswrapper[4893]: I0314 07:22:05.427625 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427660 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427687 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhtt\" (UniqueName: \"kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.427714 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.429112 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.429589 4893 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.436800 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.437387 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.439603 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.441348 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.445682 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.458134 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhtt\" (UniqueName: \"kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.473820 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.535978 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.564161 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.641377 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5frv\" (UniqueName: \"kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv\") pod \"e685f75c-8311-4120-88ca-be0e6060d132\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.641668 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts\") pod \"e685f75c-8311-4120-88ca-be0e6060d132\" (UID: \"e685f75c-8311-4120-88ca-be0e6060d132\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.642575 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e685f75c-8311-4120-88ca-be0e6060d132" (UID: "e685f75c-8311-4120-88ca-be0e6060d132"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.646137 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv" (OuterVolumeSpecName: "kube-api-access-j5frv") pod "e685f75c-8311-4120-88ca-be0e6060d132" (UID: "e685f75c-8311-4120-88ca-be0e6060d132"). InnerVolumeSpecName "kube-api-access-j5frv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.700688 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.731353 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.744002 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fww67\" (UniqueName: \"kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67\") pod \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.745214 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts\") pod \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\" (UID: \"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.749806 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5frv\" (UniqueName: \"kubernetes.io/projected/e685f75c-8311-4120-88ca-be0e6060d132-kube-api-access-j5frv\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.749832 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e685f75c-8311-4120-88ca-be0e6060d132-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.753090 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" (UID: "435a4f9d-9f5b-4f9f-bc4a-b4651831bac1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.762749 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67" (OuterVolumeSpecName: "kube-api-access-fww67") pod "435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" (UID: "435a4f9d-9f5b-4f9f-bc4a-b4651831bac1"). InnerVolumeSpecName "kube-api-access-fww67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.810713 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.811156 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-log" containerID="cri-o://fee9c35478e394d2ad663afea76c8313320b0a7f236352aaa7a70481737b3c1f" gracePeriod=30 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.811589 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-httpd" containerID="cri-o://cc50be6a0913b80b7fcb0323f8b14eebc02204609d22675c73eb7671be680d74" gracePeriod=30 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.828799 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.842725 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.852413 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts\") pod \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.852456 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rws76\" (UniqueName: \"kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76\") pod \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\" (UID: \"f50f3b01-c781-4d53-8c0c-62cb289ebbde\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.852997 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fww67\" (UniqueName: \"kubernetes.io/projected/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-kube-api-access-fww67\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.853016 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.853273 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f50f3b01-c781-4d53-8c0c-62cb289ebbde" (UID: "f50f3b01-c781-4d53-8c0c-62cb289ebbde"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.861236 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76" (OuterVolumeSpecName: "kube-api-access-rws76") pod "f50f3b01-c781-4d53-8c0c-62cb289ebbde" (UID: "f50f3b01-c781-4d53-8c0c-62cb289ebbde"). InnerVolumeSpecName "kube-api-access-rws76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.895074 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5cdc5f965f-t6wfv" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.955168 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts\") pod \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.955352 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9hd\" (UniqueName: \"kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd\") pod \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\" (UID: \"481d0c9c-3ac9-4bae-bd8c-52489deda58c\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.955429 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5mm5\" (UniqueName: \"kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5\") pod \"eba46be1-6a5c-4665-aebf-6b243dec4ed7\" (UID: \"eba46be1-6a5c-4665-aebf-6b243dec4ed7\") " Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.955963 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f50f3b01-c781-4d53-8c0c-62cb289ebbde-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.955980 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rws76\" (UniqueName: \"kubernetes.io/projected/f50f3b01-c781-4d53-8c0c-62cb289ebbde-kube-api-access-rws76\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.956978 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "481d0c9c-3ac9-4bae-bd8c-52489deda58c" (UID: "481d0c9c-3ac9-4bae-bd8c-52489deda58c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.961304 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd" (OuterVolumeSpecName: "kube-api-access-7s9hd") pod "481d0c9c-3ac9-4bae-bd8c-52489deda58c" (UID: "481d0c9c-3ac9-4bae-bd8c-52489deda58c"). InnerVolumeSpecName "kube-api-access-7s9hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.962830 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.963059 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b866f57b8-fbw4s" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-api" containerID="cri-o://66c46a8e3cf50ed9e16b93b6415976a6c522afb38e02301e75b1ac6e561113c1" gracePeriod=30 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.964036 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b866f57b8-fbw4s" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-httpd" containerID="cri-o://78579ad35de8d3d335e36041fbcc2c6cb7eddbfe098db0f486431cd760fd2ad8" gracePeriod=30 Mar 14 07:22:05 crc kubenswrapper[4893]: I0314 07:22:05.977690 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5" (OuterVolumeSpecName: "kube-api-access-w5mm5") pod "eba46be1-6a5c-4665-aebf-6b243dec4ed7" (UID: "eba46be1-6a5c-4665-aebf-6b243dec4ed7"). InnerVolumeSpecName "kube-api-access-w5mm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.063208 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5mm5\" (UniqueName: \"kubernetes.io/projected/eba46be1-6a5c-4665-aebf-6b243dec4ed7-kube-api-access-w5mm5\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.063247 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481d0c9c-3ac9-4bae-bd8c-52489deda58c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.063260 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9hd\" (UniqueName: \"kubernetes.io/projected/481d0c9c-3ac9-4bae-bd8c-52489deda58c-kube-api-access-7s9hd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.102397 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bbgd4" event={"ID":"e685f75c-8311-4120-88ca-be0e6060d132","Type":"ContainerDied","Data":"4145b69299be2b3f60904fff51b0470316df9d6499550625f6c1a91b3089e847"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.102440 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4145b69299be2b3f60904fff51b0470316df9d6499550625f6c1a91b3089e847" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.102475 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bbgd4" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.125961 4893 generic.go:334] "Generic (PLEG): container finished" podID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerID="fee9c35478e394d2ad663afea76c8313320b0a7f236352aaa7a70481737b3c1f" exitCode=143 Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.126048 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerDied","Data":"fee9c35478e394d2ad663afea76c8313320b0a7f236352aaa7a70481737b3c1f"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.132380 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7mnh" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.132530 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7mnh" event={"ID":"481d0c9c-3ac9-4bae-bd8c-52489deda58c","Type":"ContainerDied","Data":"1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.132554 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1edc68fcc86d80fa096639a8aef62520284d052bcbf85934aa2035ccb7c85ce8" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.152701 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerStarted","Data":"6ef8cade80f34d01276a53d77ec6a01908a7ebf2489c29ff53e33002ee596494"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.158936 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" event={"ID":"eba46be1-6a5c-4665-aebf-6b243dec4ed7","Type":"ContainerDied","Data":"39a8cf740031ef08b89ca65bc20f98050d01114901e5535d34932812c3393b07"} Mar 14 07:22:06 crc 
kubenswrapper[4893]: I0314 07:22:06.158974 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a8cf740031ef08b89ca65bc20f98050d01114901e5535d34932812c3393b07" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.159044 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-7m6mq" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.175971 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lpk6v" event={"ID":"f50f3b01-c781-4d53-8c0c-62cb289ebbde","Type":"ContainerDied","Data":"05347266b0b7a2cf9b6a0416cfe3f28596eb1f2d0c5e113b6bfd558ed7a206f4"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.176008 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05347266b0b7a2cf9b6a0416cfe3f28596eb1f2d0c5e113b6bfd558ed7a206f4" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.176081 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lpk6v" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.184743 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.186313 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f82a-account-create-update-99zsr" event={"ID":"435a4f9d-9f5b-4f9f-bc4a-b4651831bac1","Type":"ContainerDied","Data":"e091e6359069072e95d849579519dc6127d715d0682d72f22a1ab3894b0216e7"} Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.186366 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e091e6359069072e95d849579519dc6127d715d0682d72f22a1ab3894b0216e7" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.186501 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-99zsr" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.359582 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.657461 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.728644 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.730465 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.789047 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bxz\" (UniqueName: \"kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz\") pod \"f09ba521-7c5d-4b30-902d-d7634a77e369\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.789176 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts\") pod \"f09ba521-7c5d-4b30-902d-d7634a77e369\" (UID: \"f09ba521-7c5d-4b30-902d-d7634a77e369\") " Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.789212 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts\") pod \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.789302 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ggzn7\" (UniqueName: \"kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7\") pod \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\" (UID: \"b113c1f3-1fdd-4fd0-806e-42707f79ba1e\") " Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.789838 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f09ba521-7c5d-4b30-902d-d7634a77e369" (UID: "f09ba521-7c5d-4b30-902d-d7634a77e369"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.790236 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b113c1f3-1fdd-4fd0-806e-42707f79ba1e" (UID: "b113c1f3-1fdd-4fd0-806e-42707f79ba1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.796403 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz" (OuterVolumeSpecName: "kube-api-access-85bxz") pod "f09ba521-7c5d-4b30-902d-d7634a77e369" (UID: "f09ba521-7c5d-4b30-902d-d7634a77e369"). InnerVolumeSpecName "kube-api-access-85bxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.796431 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7" (OuterVolumeSpecName: "kube-api-access-ggzn7") pod "b113c1f3-1fdd-4fd0-806e-42707f79ba1e" (UID: "b113c1f3-1fdd-4fd0-806e-42707f79ba1e"). 
InnerVolumeSpecName "kube-api-access-ggzn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.891754 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bxz\" (UniqueName: \"kubernetes.io/projected/f09ba521-7c5d-4b30-902d-d7634a77e369-kube-api-access-85bxz\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.891796 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f09ba521-7c5d-4b30-902d-d7634a77e369-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.891809 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.891818 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggzn7\" (UniqueName: \"kubernetes.io/projected/b113c1f3-1fdd-4fd0-806e-42707f79ba1e-kube-api-access-ggzn7\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.924129 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-nn5pp"] Mar 14 07:22:06 crc kubenswrapper[4893]: I0314 07:22:06.937536 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-nn5pp"] Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.203167 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" event={"ID":"f09ba521-7c5d-4b30-902d-d7634a77e369","Type":"ContainerDied","Data":"fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec"} Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.203209 4893 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="fb8f8994ed099008f89f993c685df68001911b5d498e559b32440364880031ec" Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.203186 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-g6bdg" Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.204662 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerStarted","Data":"f69d37d7aa3aaad764b626c6801d7a4b236d60fdfae8857b91bae1f1d591f9a8"} Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.204701 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerStarted","Data":"a4b5d6451188781e94a46b2be23e8d8f4e42365d0974c6fec260c7d6efb45da6"} Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.206325 4893 generic.go:334] "Generic (PLEG): container finished" podID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerID="78579ad35de8d3d335e36041fbcc2c6cb7eddbfe098db0f486431cd760fd2ad8" exitCode=0 Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.206367 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerDied","Data":"78579ad35de8d3d335e36041fbcc2c6cb7eddbfe098db0f486431cd760fd2ad8"} Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.208641 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.212753 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-tbdb9" event={"ID":"b113c1f3-1fdd-4fd0-806e-42707f79ba1e","Type":"ContainerDied","Data":"a3b3ac1b545d66d895167b2bc981dfa3135424dae9d7961de958b5d8b38251d5"} Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.212813 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3b3ac1b545d66d895167b2bc981dfa3135424dae9d7961de958b5d8b38251d5" Mar 14 07:22:07 crc kubenswrapper[4893]: I0314 07:22:07.391480 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8ca722-d281-4ac8-9da6-74dbd04787c4" path="/var/lib/kubelet/pods/db8ca722-d281-4ac8-9da6-74dbd04787c4/volumes" Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.222724 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerStarted","Data":"0c80292baa08bff5d7ce8101e7b6d0d959d1f4d38ec5d542baa2a5236bfeb5c1"} Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.222988 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.222916 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-central-agent" containerID="cri-o://71b61bbcb36ee4b29d89e6fb6180dc6c4d178973e2003b6cff8a03ab59ae3be1" gracePeriod=30 Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.223075 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="proxy-httpd" 
containerID="cri-o://0c80292baa08bff5d7ce8101e7b6d0d959d1f4d38ec5d542baa2a5236bfeb5c1" gracePeriod=30 Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.223148 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-notification-agent" containerID="cri-o://b486620745036714e8a74e5dac973835df47c1d93972a1fd28507e74239e6c54" gracePeriod=30 Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.223186 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="sg-core" containerID="cri-o://6ef8cade80f34d01276a53d77ec6a01908a7ebf2489c29ff53e33002ee596494" gracePeriod=30 Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.233448 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerStarted","Data":"2fc121bd16a2e48ba5ae0e8191ee49932905a50f6c55bad773e4ec5663d05d86"} Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.256894 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.600064966 podStartE2EDuration="8.256877937s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="2026-03-14 07:22:01.794487354 +0000 UTC m=+1401.056664146" lastFinishedPulling="2026-03-14 07:22:07.451300315 +0000 UTC m=+1406.713477117" observedRunningTime="2026-03-14 07:22:08.253891684 +0000 UTC m=+1407.516068486" watchObservedRunningTime="2026-03-14 07:22:08.256877937 +0000 UTC m=+1407.519054729" Mar 14 07:22:08 crc kubenswrapper[4893]: I0314 07:22:08.300014 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.299993217 podStartE2EDuration="3.299993217s" 
podCreationTimestamp="2026-03-14 07:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:08.283067015 +0000 UTC m=+1407.545243807" watchObservedRunningTime="2026-03-14 07:22:08.299993217 +0000 UTC m=+1407.562170009" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.244230 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.245003 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.246958 4893 generic.go:334] "Generic (PLEG): container finished" podID="31912b7a-b762-438d-820f-84fc2f15ff85" containerID="0c80292baa08bff5d7ce8101e7b6d0d959d1f4d38ec5d542baa2a5236bfeb5c1" exitCode=0 Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.246982 4893 generic.go:334] "Generic (PLEG): container finished" podID="31912b7a-b762-438d-820f-84fc2f15ff85" containerID="6ef8cade80f34d01276a53d77ec6a01908a7ebf2489c29ff53e33002ee596494" exitCode=2 Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.246990 4893 generic.go:334] "Generic (PLEG): container finished" podID="31912b7a-b762-438d-820f-84fc2f15ff85" containerID="b486620745036714e8a74e5dac973835df47c1d93972a1fd28507e74239e6c54" exitCode=0 Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.247037 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerDied","Data":"0c80292baa08bff5d7ce8101e7b6d0d959d1f4d38ec5d542baa2a5236bfeb5c1"} Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.247130 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerDied","Data":"6ef8cade80f34d01276a53d77ec6a01908a7ebf2489c29ff53e33002ee596494"} Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.247201 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerDied","Data":"b486620745036714e8a74e5dac973835df47c1d93972a1fd28507e74239e6c54"} Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.249405 4893 generic.go:334] "Generic (PLEG): container finished" podID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerID="cc50be6a0913b80b7fcb0323f8b14eebc02204609d22675c73eb7671be680d74" exitCode=0 Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.249485 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerDied","Data":"cc50be6a0913b80b7fcb0323f8b14eebc02204609d22675c73eb7671be680d74"} Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.539147 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643139 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vrk\" (UniqueName: \"kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643250 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643279 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643308 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643328 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.643894 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.644231 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.644251 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.644279 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle\") pod \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\" (UID: \"c782f441-cf3b-4a69-965b-5d87dd4a00ad\") " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.644812 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.644868 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs" (OuterVolumeSpecName: "logs") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.657504 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk" (OuterVolumeSpecName: "kube-api-access-f9vrk") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "kube-api-access-f9vrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.657951 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.679874 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts" (OuterVolumeSpecName: "scripts") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.709766 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.735717 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.746936 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c782f441-cf3b-4a69-965b-5d87dd4a00ad-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.746993 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.747006 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.747020 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vrk\" (UniqueName: \"kubernetes.io/projected/c782f441-cf3b-4a69-965b-5d87dd4a00ad-kube-api-access-f9vrk\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.747031 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.747042 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.750641 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data" (OuterVolumeSpecName: "config-data") pod "c782f441-cf3b-4a69-965b-5d87dd4a00ad" (UID: "c782f441-cf3b-4a69-965b-5d87dd4a00ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.772001 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.848622 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:09 crc kubenswrapper[4893]: I0314 07:22:09.848660 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c782f441-cf3b-4a69-965b-5d87dd4a00ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.266149 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c782f441-cf3b-4a69-965b-5d87dd4a00ad","Type":"ContainerDied","Data":"5755d2f0017544014d3552adfe6784a4bfb019725b81953a5a8f764a925767ee"} Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.266223 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.266234 4893 scope.go:117] "RemoveContainer" containerID="cc50be6a0913b80b7fcb0323f8b14eebc02204609d22675c73eb7671be680d74" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.310949 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.316441 4893 scope.go:117] "RemoveContainer" containerID="fee9c35478e394d2ad663afea76c8313320b0a7f236352aaa7a70481737b3c1f" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.319868 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338104 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338447 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba46be1-6a5c-4665-aebf-6b243dec4ed7" containerName="oc" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338464 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba46be1-6a5c-4665-aebf-6b243dec4ed7" containerName="oc" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338473 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338479 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338490 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50f3b01-c781-4d53-8c0c-62cb289ebbde" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338496 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f50f3b01-c781-4d53-8c0c-62cb289ebbde" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338512 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-httpd" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338532 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-httpd" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338543 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481d0c9c-3ac9-4bae-bd8c-52489deda58c" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338549 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="481d0c9c-3ac9-4bae-bd8c-52489deda58c" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338562 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09ba521-7c5d-4b30-902d-d7634a77e369" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338568 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09ba521-7c5d-4b30-902d-d7634a77e369" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338577 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e685f75c-8311-4120-88ca-be0e6060d132" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338582 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e685f75c-8311-4120-88ca-be0e6060d132" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338589 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-log" Mar 14 07:22:10 crc 
kubenswrapper[4893]: I0314 07:22:10.338596 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-log" Mar 14 07:22:10 crc kubenswrapper[4893]: E0314 07:22:10.338610 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b113c1f3-1fdd-4fd0-806e-42707f79ba1e" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338618 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b113c1f3-1fdd-4fd0-806e-42707f79ba1e" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338770 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="481d0c9c-3ac9-4bae-bd8c-52489deda58c" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338780 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-log" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338793 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" containerName="glance-httpd" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338803 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba46be1-6a5c-4665-aebf-6b243dec4ed7" containerName="oc" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338810 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50f3b01-c781-4d53-8c0c-62cb289ebbde" containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338817 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338824 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e685f75c-8311-4120-88ca-be0e6060d132" 
containerName="mariadb-database-create" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338833 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b113c1f3-1fdd-4fd0-806e-42707f79ba1e" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.338842 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09ba521-7c5d-4b30-902d-d7634a77e369" containerName="mariadb-account-create-update" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.339714 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.353034 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.371289 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.371582 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472384 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472435 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472461 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472505 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472537 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472585 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472658 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.472958 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gkx\" (UniqueName: \"kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574024 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574060 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574089 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574167 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc 
kubenswrapper[4893]: I0314 07:22:10.574215 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gkx\" (UniqueName: \"kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574250 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574275 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574296 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574713 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.574950 4893 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.575303 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.580616 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.580718 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.581603 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.591849 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.595172 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gkx\" (UniqueName: \"kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.631842 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:10 crc kubenswrapper[4893]: I0314 07:22:10.710569 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.301708 4893 generic.go:334] "Generic (PLEG): container finished" podID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerID="66c46a8e3cf50ed9e16b93b6415976a6c522afb38e02301e75b1ac6e561113c1" exitCode=0 Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.301971 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerDied","Data":"66c46a8e3cf50ed9e16b93b6415976a6c522afb38e02301e75b1ac6e561113c1"} Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.320620 4893 generic.go:334] "Generic (PLEG): container finished" podID="31912b7a-b762-438d-820f-84fc2f15ff85" containerID="71b61bbcb36ee4b29d89e6fb6180dc6c4d178973e2003b6cff8a03ab59ae3be1" exitCode=0 Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.320665 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerDied","Data":"71b61bbcb36ee4b29d89e6fb6180dc6c4d178973e2003b6cff8a03ab59ae3be1"} Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.320933 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.326731 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.424377 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c782f441-cf3b-4a69-965b-5d87dd4a00ad" path="/var/lib/kubelet/pods/c782f441-cf3b-4a69-965b-5d87dd4a00ad/volumes" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513346 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513430 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513477 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgnb\" (UniqueName: \"kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513612 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513666 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513709 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.513753 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.522589 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.523091 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.527469 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb" (OuterVolumeSpecName: "kube-api-access-hcgnb") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "kube-api-access-hcgnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.530370 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts" (OuterVolumeSpecName: "scripts") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.550040 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.614546 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.615546 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") pod \"31912b7a-b762-438d-820f-84fc2f15ff85\" (UID: \"31912b7a-b762-438d-820f-84fc2f15ff85\") " Mar 14 07:22:11 crc kubenswrapper[4893]: W0314 07:22:11.615652 4893 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/31912b7a-b762-438d-820f-84fc2f15ff85/volumes/kubernetes.io~secret/combined-ca-bundle Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.615683 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616405 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616478 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616571 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31912b7a-b762-438d-820f-84fc2f15ff85-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616669 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616760 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.616834 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgnb\" (UniqueName: \"kubernetes.io/projected/31912b7a-b762-438d-820f-84fc2f15ff85-kube-api-access-hcgnb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.636341 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.706334 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data" (OuterVolumeSpecName: "config-data") pod "31912b7a-b762-438d-820f-84fc2f15ff85" (UID: "31912b7a-b762-438d-820f-84fc2f15ff85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.718140 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31912b7a-b762-438d-820f-84fc2f15ff85-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765059 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-96kn9"] Mar 14 07:22:11 crc kubenswrapper[4893]: E0314 07:22:11.765414 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="proxy-httpd" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765429 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="proxy-httpd" Mar 14 07:22:11 crc kubenswrapper[4893]: E0314 07:22:11.765441 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-notification-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765448 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-notification-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: E0314 07:22:11.765462 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="sg-core" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765468 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="sg-core" Mar 14 07:22:11 crc kubenswrapper[4893]: E0314 07:22:11.765498 4893 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-central-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765504 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-central-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765777 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-notification-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765791 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="sg-core" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765803 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="ceilometer-central-agent" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.765815 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" containerName="proxy-httpd" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.766434 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.779902 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g8mv9" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.780081 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.780183 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.787029 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-96kn9"] Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.836860 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.920492 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htr4q\" (UniqueName: \"kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q\") pod \"61f6301e-96d7-4b42-b14c-1286aff6c13f\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.920566 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config\") pod \"61f6301e-96d7-4b42-b14c-1286aff6c13f\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.920727 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs\") pod \"61f6301e-96d7-4b42-b14c-1286aff6c13f\" (UID: 
\"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.920905 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config\") pod \"61f6301e-96d7-4b42-b14c-1286aff6c13f\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.921002 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle\") pod \"61f6301e-96d7-4b42-b14c-1286aff6c13f\" (UID: \"61f6301e-96d7-4b42-b14c-1286aff6c13f\") " Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.921878 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.922065 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.922133 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:11 crc 
kubenswrapper[4893]: I0314 07:22:11.922161 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4wz\" (UniqueName: \"kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.925621 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "61f6301e-96d7-4b42-b14c-1286aff6c13f" (UID: "61f6301e-96d7-4b42-b14c-1286aff6c13f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.931849 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q" (OuterVolumeSpecName: "kube-api-access-htr4q") pod "61f6301e-96d7-4b42-b14c-1286aff6c13f" (UID: "61f6301e-96d7-4b42-b14c-1286aff6c13f"). InnerVolumeSpecName "kube-api-access-htr4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.975842 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config" (OuterVolumeSpecName: "config") pod "61f6301e-96d7-4b42-b14c-1286aff6c13f" (UID: "61f6301e-96d7-4b42-b14c-1286aff6c13f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[4893]: I0314 07:22:11.982589 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f6301e-96d7-4b42-b14c-1286aff6c13f" (UID: "61f6301e-96d7-4b42-b14c-1286aff6c13f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.010249 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "61f6301e-96d7-4b42-b14c-1286aff6c13f" (UID: "61f6301e-96d7-4b42-b14c-1286aff6c13f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024313 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024378 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024403 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4wz\" (UniqueName: \"kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz\") pod 
\"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024486 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024687 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024706 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024718 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htr4q\" (UniqueName: \"kubernetes.io/projected/61f6301e-96d7-4b42-b14c-1286aff6c13f-kube-api-access-htr4q\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024730 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.024746 4893 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f6301e-96d7-4b42-b14c-1286aff6c13f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.028000 4893 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.029308 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.035125 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.039011 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4wz\" (UniqueName: \"kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz\") pod \"nova-cell0-conductor-db-sync-96kn9\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.126871 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.332179 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31912b7a-b762-438d-820f-84fc2f15ff85","Type":"ContainerDied","Data":"c58a47973eaecdc8e02227a8b12a8df31f92d1ce64dcf9511756574e58a81857"} Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.332435 4893 scope.go:117] "RemoveContainer" containerID="0c80292baa08bff5d7ce8101e7b6d0d959d1f4d38ec5d542baa2a5236bfeb5c1" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.332238 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.336693 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerStarted","Data":"43f69db823a3f9f8c85a6a93d52c4972fbfa042fe09bc248a37ffd104f7a1ea4"} Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.336718 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerStarted","Data":"3f9e7a2379346ac60b5e650f8b10b47f08982c0ae473e7d2b73e4a1058b47d2e"} Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.340625 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b866f57b8-fbw4s" event={"ID":"61f6301e-96d7-4b42-b14c-1286aff6c13f","Type":"ContainerDied","Data":"01aafd1c37af981ae6a784c909126598b665e39f904330c0d510c03e31a4fa65"} Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.340679 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b866f57b8-fbw4s" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.367362 4893 scope.go:117] "RemoveContainer" containerID="6ef8cade80f34d01276a53d77ec6a01908a7ebf2489c29ff53e33002ee596494" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.381691 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.398462 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.407293 4893 scope.go:117] "RemoveContainer" containerID="b486620745036714e8a74e5dac973835df47c1d93972a1fd28507e74239e6c54" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.410054 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.417857 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b866f57b8-fbw4s"] Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.424999 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:12 crc kubenswrapper[4893]: E0314 07:22:12.425434 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-httpd" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.425451 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-httpd" Mar 14 07:22:12 crc kubenswrapper[4893]: E0314 07:22:12.425479 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-api" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.425486 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-api" Mar 14 07:22:12 crc 
kubenswrapper[4893]: I0314 07:22:12.425702 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-httpd" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.425719 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" containerName="neutron-api" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.432251 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.432375 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.434748 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.434986 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.444585 4893 scope.go:117] "RemoveContainer" containerID="71b61bbcb36ee4b29d89e6fb6180dc6c4d178973e2003b6cff8a03ab59ae3be1" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535093 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535311 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0" Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 
07:22:12.535360 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535382 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535437 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535464 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p5v5\" (UniqueName: \"kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.535480 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.561650 4893 scope.go:117] "RemoveContainer" containerID="78579ad35de8d3d335e36041fbcc2c6cb7eddbfe098db0f486431cd760fd2ad8"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.581991 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-96kn9"]
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.583888 4893 scope.go:117] "RemoveContainer" containerID="66c46a8e3cf50ed9e16b93b6415976a6c522afb38e02301e75b1ac6e561113c1"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636722 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636762 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636803 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636827 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636861 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636888 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p5v5\" (UniqueName: \"kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.636906 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.637352 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.639299 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.642202 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.642750 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.644493 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.651796 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.658715 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p5v5\" (UniqueName: \"kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5\") pod \"ceilometer-0\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") " pod="openstack/ceilometer-0"
Mar 14 07:22:12 crc kubenswrapper[4893]: I0314 07:22:12.841588 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:22:13 crc kubenswrapper[4893]: W0314 07:22:13.289250 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3942ba87_6d7f_48e3_a7c8_06971f57d46d.slice/crio-10dafc75f51fb54f9ea174816a9011410c7790f91882cb3450e075c3e32ac7df WatchSource:0}: Error finding container 10dafc75f51fb54f9ea174816a9011410c7790f91882cb3450e075c3e32ac7df: Status 404 returned error can't find the container with id 10dafc75f51fb54f9ea174816a9011410c7790f91882cb3450e075c3e32ac7df
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.290982 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.351481 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerStarted","Data":"10dafc75f51fb54f9ea174816a9011410c7790f91882cb3450e075c3e32ac7df"}
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.355343 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerStarted","Data":"4d1c426d2b310ef6f81cd1f91ebfb6670cc4b8ebb961307ca3eb73f633e88e39"}
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.361379 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-96kn9" event={"ID":"b57ae19b-38cb-419e-ab8f-7f0bb5ead383","Type":"ContainerStarted","Data":"472dc2e3d79e2e2b81938251f4e53124b3ad8441f827b541040b8850108b6d30"}
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.378213 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.378195032 podStartE2EDuration="3.378195032s" podCreationTimestamp="2026-03-14 07:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:13.375554878 +0000 UTC m=+1412.637731690" watchObservedRunningTime="2026-03-14 07:22:13.378195032 +0000 UTC m=+1412.640371824"
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.400230 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31912b7a-b762-438d-820f-84fc2f15ff85" path="/var/lib/kubelet/pods/31912b7a-b762-438d-820f-84fc2f15ff85/volumes"
Mar 14 07:22:13 crc kubenswrapper[4893]: I0314 07:22:13.405210 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f6301e-96d7-4b42-b14c-1286aff6c13f" path="/var/lib/kubelet/pods/61f6301e-96d7-4b42-b14c-1286aff6c13f/volumes"
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.055549 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.388631 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerStarted","Data":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"}
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.388688 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerStarted","Data":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"}
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.565135 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.565489 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.637164 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:15 crc kubenswrapper[4893]: I0314 07:22:15.642395 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:16 crc kubenswrapper[4893]: I0314 07:22:16.414369 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerStarted","Data":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"}
Mar 14 07:22:16 crc kubenswrapper[4893]: I0314 07:22:16.416939 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:16 crc kubenswrapper[4893]: I0314 07:22:16.416970 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:18 crc kubenswrapper[4893]: I0314 07:22:18.259779 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:18 crc kubenswrapper[4893]: I0314 07:22:18.262031 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 14 07:22:20 crc kubenswrapper[4893]: I0314 07:22:20.711381 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:20 crc kubenswrapper[4893]: I0314 07:22:20.712985 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:20 crc kubenswrapper[4893]: I0314 07:22:20.754067 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:20 crc kubenswrapper[4893]: I0314 07:22:20.754145 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:21 crc kubenswrapper[4893]: I0314 07:22:21.470255 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-96kn9" event={"ID":"b57ae19b-38cb-419e-ab8f-7f0bb5ead383","Type":"ContainerStarted","Data":"a0e606f83c4344007c738a0607d69e9e4174dab58bc088f15e1b8aeacbda77a9"}
Mar 14 07:22:21 crc kubenswrapper[4893]: I0314 07:22:21.470793 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:21 crc kubenswrapper[4893]: I0314 07:22:21.470815 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.482078 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerStarted","Data":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"}
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.483003 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-central-agent" containerID="cri-o://30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31" gracePeriod=30
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.483420 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="proxy-httpd" containerID="cri-o://7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e" gracePeriod=30
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.483449 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="sg-core" containerID="cri-o://7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd" gracePeriod=30
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.483511 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-notification-agent" containerID="cri-o://25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe" gracePeriod=30
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.509032 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-96kn9" podStartSLOduration=3.437160442 podStartE2EDuration="11.509010546s" podCreationTimestamp="2026-03-14 07:22:11 +0000 UTC" firstStartedPulling="2026-03-14 07:22:12.620734802 +0000 UTC m=+1411.882911594" lastFinishedPulling="2026-03-14 07:22:20.692584906 +0000 UTC m=+1419.954761698" observedRunningTime="2026-03-14 07:22:21.495911732 +0000 UTC m=+1420.758088544" watchObservedRunningTime="2026-03-14 07:22:22.509010546 +0000 UTC m=+1421.771187338"
Mar 14 07:22:22 crc kubenswrapper[4893]: I0314 07:22:22.515900 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.178956535 podStartE2EDuration="10.515878173s" podCreationTimestamp="2026-03-14 07:22:12 +0000 UTC" firstStartedPulling="2026-03-14 07:22:13.292357932 +0000 UTC m=+1412.554534724" lastFinishedPulling="2026-03-14 07:22:21.62927958 +0000 UTC m=+1420.891456362" observedRunningTime="2026-03-14 07:22:22.509351714 +0000 UTC m=+1421.771528546" watchObservedRunningTime="2026-03-14 07:22:22.515878173 +0000 UTC m=+1421.778054965"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.426963 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.458793 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.458870 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.458893 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.458940 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.458971 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.459034 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.459066 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p5v5\" (UniqueName: \"kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5\") pod \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\" (UID: \"3942ba87-6d7f-48e3-a7c8-06971f57d46d\") "
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.459488 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.474826 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.498720 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts" (OuterVolumeSpecName: "scripts") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.499762 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5" (OuterVolumeSpecName: "kube-api-access-7p5v5") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "kube-api-access-7p5v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508660 4893 generic.go:334] "Generic (PLEG): container finished" podID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e" exitCode=0
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508721 4893 generic.go:334] "Generic (PLEG): container finished" podID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd" exitCode=2
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508733 4893 generic.go:334] "Generic (PLEG): container finished" podID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe" exitCode=0
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508740 4893 generic.go:334] "Generic (PLEG): container finished" podID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31" exitCode=0
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508764 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerDied","Data":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"}
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508817 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerDied","Data":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"}
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508832 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerDied","Data":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"}
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508842 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerDied","Data":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"}
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3942ba87-6d7f-48e3-a7c8-06971f57d46d","Type":"ContainerDied","Data":"10dafc75f51fb54f9ea174816a9011410c7790f91882cb3450e075c3e32ac7df"}
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.508897 4893 scope.go:117] "RemoveContainer" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.509099 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.519682 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.556662 4893 scope.go:117] "RemoveContainer" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.558135 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.558232 4893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.564469 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.585908 4893 scope.go:117] "RemoveContainer" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.589922 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.589955 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.589969 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3942ba87-6d7f-48e3-a7c8-06971f57d46d-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.589981 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.589993 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p5v5\" (UniqueName: \"kubernetes.io/projected/3942ba87-6d7f-48e3-a7c8-06971f57d46d-kube-api-access-7p5v5\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.590005 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.591695 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.624823 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data" (OuterVolumeSpecName: "config-data") pod "3942ba87-6d7f-48e3-a7c8-06971f57d46d" (UID: "3942ba87-6d7f-48e3-a7c8-06971f57d46d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.636386 4893 scope.go:117] "RemoveContainer" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.698799 4893 scope.go:117] "RemoveContainer" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.699474 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": container with ID starting with 7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e not found: ID does not exist" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.699560 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"} err="failed to get container status \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": rpc error: code = NotFound desc = could not find container \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": container with ID starting with 7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.699658 4893 scope.go:117] "RemoveContainer" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.700271 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": container with ID starting with 7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd not found: ID does not exist" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.700312 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"} err="failed to get container status \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": rpc error: code = NotFound desc = could not find container \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": container with ID starting with 7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.700356 4893 scope.go:117] "RemoveContainer" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.701215 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": container with ID starting with 25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe not found: ID does not exist" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.701265 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"} err="failed to get container status \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": rpc error: code = NotFound desc = could not find container \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": container with ID starting with 25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.701281 4893 scope.go:117] "RemoveContainer" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.701661 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3942ba87-6d7f-48e3-a7c8-06971f57d46d-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.701747 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": container with ID starting with 30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31 not found: ID does not exist" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.701775 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"} err="failed to get container status \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": rpc error: code = NotFound desc = could not find container \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": container with ID starting with 30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31 not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.701797 4893 scope.go:117] "RemoveContainer" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.703652 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"} err="failed to get container status \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": rpc error: code = NotFound desc = could not find container \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": container with ID starting with 7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.703678 4893 scope.go:117] "RemoveContainer" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704065 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"} err="failed to get container status \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": rpc error: code = NotFound desc = could not find container \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": container with ID starting with 7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704085 4893 scope.go:117] "RemoveContainer" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704275 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"} err="failed to get container status \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": rpc error: code = NotFound desc = could not find container \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": container with ID starting with 25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704314 4893 scope.go:117] "RemoveContainer" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704647 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"} err="failed to get container status \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": rpc error: code = NotFound desc = could not find container \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": container with ID starting with 30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31 not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704669 4893 scope.go:117] "RemoveContainer" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704965 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"} err="failed to get container status \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": rpc error: code = NotFound desc = could not find container \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": container with ID starting with 7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.704983 4893 scope.go:117] "RemoveContainer" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.705272 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"} err="failed to get container status \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": rpc error: code = NotFound desc = could not find container \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": container with ID starting with 7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.705290 4893 scope.go:117] "RemoveContainer" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.705671 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"} err="failed to get container status \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": rpc error: code = NotFound desc = could not find container \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": container with ID starting with 25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.705710 4893 scope.go:117] "RemoveContainer" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.706000 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"} err="failed to get container status \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": rpc error: code = NotFound desc = could not find container \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": container with ID starting with 30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31 not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.706019 4893 scope.go:117] "RemoveContainer" containerID="7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.706498 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e"} err="failed to get container status \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": rpc error: code = NotFound desc = could not find container \"7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e\": container with ID starting with 7b081f5ce45d159c18dc27d3249641125368a1e39292325cf91a07491435210e not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.706590 4893 scope.go:117] "RemoveContainer" containerID="7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.707250 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd"} err="failed to get container status \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": rpc error: code = NotFound desc = could not find container \"7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd\": container with ID starting with 7258d14faca081912758f1a10ef75ccd90a7802632c4f949a4df42fb36bfecbd not found: ID does not exist"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.707311 4893 scope.go:117] "RemoveContainer" containerID="25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"
Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.707708 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe"} err="failed to get container status \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": rpc error: code = NotFound desc = could not find container \"25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe\": container with ID starting with
25efbeec87116a417d377421f90599f56b6b300e254cdb3f0c3c661d270a24fe not found: ID does not exist" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.707735 4893 scope.go:117] "RemoveContainer" containerID="30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.710222 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31"} err="failed to get container status \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": rpc error: code = NotFound desc = could not find container \"30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31\": container with ID starting with 30d88df3e82c6133a241b6be97ff6063c9dbd2e41ae5c37f404be50ee1df9d31 not found: ID does not exist" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.848570 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.850107 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879030 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.879428 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-notification-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879444 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-notification-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.879469 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="sg-core" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879475 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="sg-core" Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.879488 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="proxy-httpd" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879494 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="proxy-httpd" Mar 14 07:22:23 crc kubenswrapper[4893]: E0314 07:22:23.879505 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-central-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879511 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-central-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879685 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="sg-core" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879699 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-notification-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879716 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="ceilometer-central-agent" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.879732 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" containerName="proxy-httpd" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.881219 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.883286 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.883987 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:23 crc kubenswrapper[4893]: I0314 07:22:23.898409 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.005713 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.005772 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2ck\" (UniqueName: \"kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.006374 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.006428 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " 
pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.006929 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.007132 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.007174 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.109660 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.109979 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110130 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110231 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2ck\" (UniqueName: \"kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110353 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110887 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110812 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.110598 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 
07:22:24.111655 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.114850 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.114904 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.116419 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.128931 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.133113 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2ck\" (UniqueName: \"kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck\") pod \"ceilometer-0\" (UID: 
\"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.260398 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:24 crc kubenswrapper[4893]: W0314 07:22:24.821655 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f76416_fb2c_4ee5_b5c3_cd641ff16d3c.slice/crio-73582fe2518e8fc0c014ff3392ddafd33731a9135875473c30a7146f400cefbc WatchSource:0}: Error finding container 73582fe2518e8fc0c014ff3392ddafd33731a9135875473c30a7146f400cefbc: Status 404 returned error can't find the container with id 73582fe2518e8fc0c014ff3392ddafd33731a9135875473c30a7146f400cefbc Mar 14 07:22:24 crc kubenswrapper[4893]: I0314 07:22:24.823479 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:25 crc kubenswrapper[4893]: I0314 07:22:25.388703 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3942ba87-6d7f-48e3-a7c8-06971f57d46d" path="/var/lib/kubelet/pods/3942ba87-6d7f-48e3-a7c8-06971f57d46d/volumes" Mar 14 07:22:25 crc kubenswrapper[4893]: I0314 07:22:25.535687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerStarted","Data":"73582fe2518e8fc0c014ff3392ddafd33731a9135875473c30a7146f400cefbc"} Mar 14 07:22:26 crc kubenswrapper[4893]: I0314 07:22:26.550648 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerStarted","Data":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} Mar 14 07:22:26 crc kubenswrapper[4893]: I0314 07:22:26.551057 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerStarted","Data":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} Mar 14 07:22:27 crc kubenswrapper[4893]: I0314 07:22:27.562463 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerStarted","Data":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} Mar 14 07:22:29 crc kubenswrapper[4893]: I0314 07:22:29.731133 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:22:29 crc kubenswrapper[4893]: I0314 07:22:29.731545 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:22:29 crc kubenswrapper[4893]: I0314 07:22:29.731614 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:22:29 crc kubenswrapper[4893]: I0314 07:22:29.732506 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:22:29 crc kubenswrapper[4893]: I0314 07:22:29.732580 4893 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656" gracePeriod=600 Mar 14 07:22:30 crc kubenswrapper[4893]: I0314 07:22:30.609326 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656" exitCode=0 Mar 14 07:22:30 crc kubenswrapper[4893]: I0314 07:22:30.609594 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656"} Mar 14 07:22:30 crc kubenswrapper[4893]: I0314 07:22:30.609730 4893 scope.go:117] "RemoveContainer" containerID="6b7e5a6c1e81433472238895d55cad009404cc5608a11ac60397f9c58ede773f" Mar 14 07:22:31 crc kubenswrapper[4893]: I0314 07:22:31.632709 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerStarted","Data":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} Mar 14 07:22:31 crc kubenswrapper[4893]: I0314 07:22:31.633123 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:31 crc kubenswrapper[4893]: I0314 07:22:31.635326 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c"} Mar 14 07:22:31 crc kubenswrapper[4893]: I0314 07:22:31.678488 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=3.024444393 podStartE2EDuration="8.678471241s" podCreationTimestamp="2026-03-14 07:22:23 +0000 UTC" firstStartedPulling="2026-03-14 07:22:24.824665089 +0000 UTC m=+1424.086841871" lastFinishedPulling="2026-03-14 07:22:30.478691917 +0000 UTC m=+1429.740868719" observedRunningTime="2026-03-14 07:22:31.657032913 +0000 UTC m=+1430.919209715" watchObservedRunningTime="2026-03-14 07:22:31.678471241 +0000 UTC m=+1430.940648023" Mar 14 07:22:33 crc kubenswrapper[4893]: I0314 07:22:33.128932 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:33 crc kubenswrapper[4893]: I0314 07:22:33.654408 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-central-agent" containerID="cri-o://a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" gracePeriod=30 Mar 14 07:22:33 crc kubenswrapper[4893]: I0314 07:22:33.654554 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="proxy-httpd" containerID="cri-o://78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" gracePeriod=30 Mar 14 07:22:33 crc kubenswrapper[4893]: I0314 07:22:33.654564 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="sg-core" containerID="cri-o://13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" gracePeriod=30 Mar 14 07:22:33 crc kubenswrapper[4893]: I0314 07:22:33.654603 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-notification-agent" containerID="cri-o://ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" gracePeriod=30 Mar 14 
07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.447106 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.612876 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613052 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613100 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613133 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql2ck\" (UniqueName: \"kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613177 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613223 
4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.613239 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd\") pod \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\" (UID: \"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c\") " Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.614253 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.614452 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.619116 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts" (OuterVolumeSpecName: "scripts") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.620549 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck" (OuterVolumeSpecName: "kube-api-access-ql2ck") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "kube-api-access-ql2ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.646474 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666615 4893 generic.go:334] "Generic (PLEG): container finished" podID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" exitCode=0 Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666669 4893 generic.go:334] "Generic (PLEG): container finished" podID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" exitCode=2 Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666683 4893 generic.go:334] "Generic (PLEG): container finished" podID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" exitCode=0 Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666697 4893 generic.go:334] "Generic (PLEG): container finished" podID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" exitCode=0 Mar 14 
07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666727 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerDied","Data":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666770 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666765 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerDied","Data":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666861 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerDied","Data":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666881 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerDied","Data":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666898 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f76416-fb2c-4ee5-b5c3-cd641ff16d3c","Type":"ContainerDied","Data":"73582fe2518e8fc0c014ff3392ddafd33731a9135875473c30a7146f400cefbc"} Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.666924 4893 scope.go:117] "RemoveContainer" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.688713 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.690114 4893 scope.go:117] "RemoveContainer" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.709127 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data" (OuterVolumeSpecName: "config-data") pod "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" (UID: "68f76416-fb2c-4ee5-b5c3-cd641ff16d3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.709823 4893 scope.go:117] "RemoveContainer" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715238 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715266 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715276 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715284 4893 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ql2ck\" (UniqueName: \"kubernetes.io/projected/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-kube-api-access-ql2ck\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715294 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715302 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.715310 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.728111 4893 scope.go:117] "RemoveContainer" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.749298 4893 scope.go:117] "RemoveContainer" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: E0314 07:22:34.750356 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": container with ID starting with 78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc not found: ID does not exist" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.750409 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} err="failed to get container status \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": rpc error: code = NotFound desc = could not find container \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": container with ID starting with 78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.750440 4893 scope.go:117] "RemoveContainer" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: E0314 07:22:34.750943 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": container with ID starting with 13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784 not found: ID does not exist" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.751009 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} err="failed to get container status \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": rpc error: code = NotFound desc = could not find container \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": container with ID starting with 13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.751048 4893 scope.go:117] "RemoveContainer" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: E0314 07:22:34.751613 4893 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": container with ID starting with ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d not found: ID does not exist" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.751647 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} err="failed to get container status \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": rpc error: code = NotFound desc = could not find container \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": container with ID starting with ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.751670 4893 scope.go:117] "RemoveContainer" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: E0314 07:22:34.752109 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": container with ID starting with a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750 not found: ID does not exist" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752141 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} err="failed to get container status \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": rpc error: code = NotFound desc = could not find container 
\"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": container with ID starting with a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752162 4893 scope.go:117] "RemoveContainer" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752497 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} err="failed to get container status \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": rpc error: code = NotFound desc = could not find container \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": container with ID starting with 78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752580 4893 scope.go:117] "RemoveContainer" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752885 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} err="failed to get container status \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": rpc error: code = NotFound desc = could not find container \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": container with ID starting with 13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.752910 4893 scope.go:117] "RemoveContainer" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753174 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} err="failed to get container status \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": rpc error: code = NotFound desc = could not find container \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": container with ID starting with ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753194 4893 scope.go:117] "RemoveContainer" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753487 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} err="failed to get container status \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": rpc error: code = NotFound desc = could not find container \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": container with ID starting with a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753571 4893 scope.go:117] "RemoveContainer" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753858 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} err="failed to get container status \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": rpc error: code = NotFound desc = could not find container \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": container with ID starting with 
78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.753900 4893 scope.go:117] "RemoveContainer" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754151 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} err="failed to get container status \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": rpc error: code = NotFound desc = could not find container \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": container with ID starting with 13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754196 4893 scope.go:117] "RemoveContainer" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754514 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} err="failed to get container status \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": rpc error: code = NotFound desc = could not find container \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": container with ID starting with ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754573 4893 scope.go:117] "RemoveContainer" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754938 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} err="failed to get container status \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": rpc error: code = NotFound desc = could not find container \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": container with ID starting with a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750 not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.754982 4893 scope.go:117] "RemoveContainer" containerID="78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.755398 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc"} err="failed to get container status \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": rpc error: code = NotFound desc = could not find container \"78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc\": container with ID starting with 78845d818142f720fd8d7d8c6df6b7453444798b0ba9a40d8c533a6fec9d37fc not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.755441 4893 scope.go:117] "RemoveContainer" containerID="13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.755985 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784"} err="failed to get container status \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": rpc error: code = NotFound desc = could not find container \"13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784\": container with ID starting with 13dae70aa4245e5fe03a14d69771afb52a0dc74bb52c53c6760616b1d9500784 not found: ID does not 
exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.756032 4893 scope.go:117] "RemoveContainer" containerID="ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.756307 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d"} err="failed to get container status \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": rpc error: code = NotFound desc = could not find container \"ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d\": container with ID starting with ca275f31d0ccdaf6dd527f27cc02130a43801fb38f3f2eea6213a4584393d63d not found: ID does not exist" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.756331 4893 scope.go:117] "RemoveContainer" containerID="a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750" Mar 14 07:22:34 crc kubenswrapper[4893]: I0314 07:22:34.756617 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750"} err="failed to get container status \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": rpc error: code = NotFound desc = could not find container \"a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750\": container with ID starting with a779b33a0061f8c13a3cb36595d96876bed2bf6540269a38b323e93a58171750 not found: ID does not exist" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.012214 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.027981 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.041355 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 
07:22:35 crc kubenswrapper[4893]: E0314 07:22:35.042026 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="proxy-httpd" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042040 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="proxy-httpd" Mar 14 07:22:35 crc kubenswrapper[4893]: E0314 07:22:35.042059 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-central-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042068 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-central-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: E0314 07:22:35.042093 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="sg-core" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042104 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="sg-core" Mar 14 07:22:35 crc kubenswrapper[4893]: E0314 07:22:35.042134 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-notification-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042143 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-notification-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042334 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="ceilometer-notification-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042348 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" 
containerName="ceilometer-central-agent" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042365 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="proxy-httpd" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.042378 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" containerName="sg-core" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.044454 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.048188 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.048402 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.059318 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.223723 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.224407 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.224647 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zlgtv\" (UniqueName: \"kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.224857 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.225004 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.225171 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.225479 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327466 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327574 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327638 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlgtv\" (UniqueName: \"kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327685 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327706 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327733 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.327773 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.328448 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.329064 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.331441 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.332638 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.334451 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.344346 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlgtv\" (UniqueName: \"kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.344502 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts\") pod \"ceilometer-0\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.383839 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.389846 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f76416-fb2c-4ee5-b5c3-cd641ff16d3c" path="/var/lib/kubelet/pods/68f76416-fb2c-4ee5-b5c3-cd641ff16d3c/volumes" Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.678055 4893 generic.go:334] "Generic (PLEG): container finished" podID="b57ae19b-38cb-419e-ab8f-7f0bb5ead383" containerID="a0e606f83c4344007c738a0607d69e9e4174dab58bc088f15e1b8aeacbda77a9" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.678100 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-96kn9" event={"ID":"b57ae19b-38cb-419e-ab8f-7f0bb5ead383","Type":"ContainerDied","Data":"a0e606f83c4344007c738a0607d69e9e4174dab58bc088f15e1b8aeacbda77a9"} Mar 14 07:22:35 crc kubenswrapper[4893]: W0314 07:22:35.825877 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f8dfb58_4c36_4f9f_b142_29a325099f3e.slice/crio-a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5 WatchSource:0}: Error finding container 
a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5: Status 404 returned error can't find the container with id a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5 Mar 14 07:22:35 crc kubenswrapper[4893]: I0314 07:22:35.827838 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:36 crc kubenswrapper[4893]: I0314 07:22:36.688192 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerStarted","Data":"950611c56e3344fdc62264065d2587e5a1938230c6418d51d23815de1934eaea"} Mar 14 07:22:36 crc kubenswrapper[4893]: I0314 07:22:36.688637 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerStarted","Data":"a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5"} Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.097658 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.262616 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle\") pod \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.263010 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data\") pod \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.263106 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl4wz\" (UniqueName: \"kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz\") pod \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.264572 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts\") pod \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\" (UID: \"b57ae19b-38cb-419e-ab8f-7f0bb5ead383\") " Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.270696 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz" (OuterVolumeSpecName: "kube-api-access-zl4wz") pod "b57ae19b-38cb-419e-ab8f-7f0bb5ead383" (UID: "b57ae19b-38cb-419e-ab8f-7f0bb5ead383"). InnerVolumeSpecName "kube-api-access-zl4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.270708 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts" (OuterVolumeSpecName: "scripts") pod "b57ae19b-38cb-419e-ab8f-7f0bb5ead383" (UID: "b57ae19b-38cb-419e-ab8f-7f0bb5ead383"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.294929 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b57ae19b-38cb-419e-ab8f-7f0bb5ead383" (UID: "b57ae19b-38cb-419e-ab8f-7f0bb5ead383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.307051 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data" (OuterVolumeSpecName: "config-data") pod "b57ae19b-38cb-419e-ab8f-7f0bb5ead383" (UID: "b57ae19b-38cb-419e-ab8f-7f0bb5ead383"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.367080 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl4wz\" (UniqueName: \"kubernetes.io/projected/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-kube-api-access-zl4wz\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.367110 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.367121 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.367131 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57ae19b-38cb-419e-ab8f-7f0bb5ead383-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.697929 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-96kn9" event={"ID":"b57ae19b-38cb-419e-ab8f-7f0bb5ead383","Type":"ContainerDied","Data":"472dc2e3d79e2e2b81938251f4e53124b3ad8441f827b541040b8850108b6d30"} Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.697967 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472dc2e3d79e2e2b81938251f4e53124b3ad8441f827b541040b8850108b6d30" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.698019 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-96kn9" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.701745 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerStarted","Data":"d5aab96a43fad3ce075801aad0ba40c5256b90c39be16a195a7a51e462662ae4"} Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.791357 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:22:37 crc kubenswrapper[4893]: E0314 07:22:37.792250 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57ae19b-38cb-419e-ab8f-7f0bb5ead383" containerName="nova-cell0-conductor-db-sync" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.792295 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57ae19b-38cb-419e-ab8f-7f0bb5ead383" containerName="nova-cell0-conductor-db-sync" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.792471 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57ae19b-38cb-419e-ab8f-7f0bb5ead383" containerName="nova-cell0-conductor-db-sync" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.794001 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.797496 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.798013 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g8mv9" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.804489 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.982367 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.982683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpglh\" (UniqueName: \"kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:37 crc kubenswrapper[4893]: I0314 07:22:37.982713 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.084400 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.084499 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpglh\" (UniqueName: \"kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.084562 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.088640 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.089002 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.101147 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpglh\" (UniqueName: \"kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh\") pod \"nova-cell0-conductor-0\" 
(UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.126072 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.566643 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:22:38 crc kubenswrapper[4893]: W0314 07:22:38.569616 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb5d1a2a_ad9f_4eb7_8a70_37a1523a1b6f.slice/crio-718f695f76bab1d83372fb0a3cbf3c3aaf7b424b06a7cb6f76e286df01b44511 WatchSource:0}: Error finding container 718f695f76bab1d83372fb0a3cbf3c3aaf7b424b06a7cb6f76e286df01b44511: Status 404 returned error can't find the container with id 718f695f76bab1d83372fb0a3cbf3c3aaf7b424b06a7cb6f76e286df01b44511 Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.711578 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerStarted","Data":"7a443e5103fbd27a22cea9d965cf65d3ae0ac1bbfe32f671c5f3f620ad3f8284"} Mar 14 07:22:38 crc kubenswrapper[4893]: I0314 07:22:38.713261 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f","Type":"ContainerStarted","Data":"718f695f76bab1d83372fb0a3cbf3c3aaf7b424b06a7cb6f76e286df01b44511"} Mar 14 07:22:39 crc kubenswrapper[4893]: I0314 07:22:39.724198 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f","Type":"ContainerStarted","Data":"7ab1a7eb14a49fb840db8cffee0e903b4ae4cb885907191a6bc0ab1496f7f86c"} Mar 14 07:22:39 crc kubenswrapper[4893]: I0314 07:22:39.724655 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:39 crc kubenswrapper[4893]: I0314 07:22:39.747686 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.747663776 podStartE2EDuration="2.747663776s" podCreationTimestamp="2026-03-14 07:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:39.740192976 +0000 UTC m=+1439.002369778" watchObservedRunningTime="2026-03-14 07:22:39.747663776 +0000 UTC m=+1439.009840568" Mar 14 07:22:41 crc kubenswrapper[4893]: I0314 07:22:41.743689 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerStarted","Data":"3b4dcfde76f399ad63952470692ea7797b03fc5391b57f10b894e2301aa465e2"} Mar 14 07:22:41 crc kubenswrapper[4893]: I0314 07:22:41.744213 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[4893]: I0314 07:22:41.773738 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.022912611 podStartE2EDuration="6.773693009s" podCreationTimestamp="2026-03-14 07:22:35 +0000 UTC" firstStartedPulling="2026-03-14 07:22:35.828032185 +0000 UTC m=+1435.090208997" lastFinishedPulling="2026-03-14 07:22:40.578812603 +0000 UTC m=+1439.840989395" observedRunningTime="2026-03-14 07:22:41.765255715 +0000 UTC m=+1441.027432527" watchObservedRunningTime="2026-03-14 07:22:41.773693009 +0000 UTC m=+1441.035869791" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.152209 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.646893 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-k5lhh"] Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.648012 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.650052 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.650183 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.661578 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k5lhh"] Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.792397 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.792688 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.792772 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 
07:22:43.792828 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwb2g\" (UniqueName: \"kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.809341 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.822403 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.829716 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.830173 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.893779 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.898366 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.898460 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc 
kubenswrapper[4893]: I0314 07:22:43.898501 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.898639 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.898665 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmght\" (UniqueName: \"kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.898912 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.899063 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwb2g\" (UniqueName: \"kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: 
I0314 07:22:43.955899 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.961576 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.962316 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.986431 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.986607 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:43 crc kubenswrapper[4893]: I0314 07:22:43.986804 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwb2g\" (UniqueName: \"kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g\") pod \"nova-cell0-cell-mapping-k5lhh\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.003811 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.003860 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.003900 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmght\" (UniqueName: \"kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.003970 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.004019 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.004042 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftj8h\" (UniqueName: \"kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h\") 
pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.004068 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.036644 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.039293 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.039376 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.060824 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmght\" (UniqueName: \"kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.073434 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.075204 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.079015 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.103160 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107706 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107755 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftj8h\" (UniqueName: \"kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107781 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107808 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107828 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107899 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107918 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddwc6\" (UniqueName: \"kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.107953 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.108650 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.119415 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.120947 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.123930 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.124376 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.124784 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.125982 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.129265 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftj8h\" (UniqueName: \"kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h\") pod \"nova-api-0\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.163741 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.172012 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.173772 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.193349 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210353 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210405 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210438 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210622 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210677 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210716 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210738 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210755 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddwc6\" (UniqueName: \"kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210775 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210806 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210837 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m75qh\" (UniqueName: \"kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210855 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.210893 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9x49\" (UniqueName: \"kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.212715 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.213817 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.222083 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.222297 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.233043 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddwc6\" (UniqueName: \"kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6\") pod \"nova-metadata-0\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") " pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.238726 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.266632 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312315 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9x49\" (UniqueName: \"kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312604 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312680 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312713 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312743 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: 
I0314 07:22:44.312779 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312800 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312840 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m75qh\" (UniqueName: \"kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.312856 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.314432 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.315963 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.318005 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.318483 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.319412 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.321387 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.324873 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data\") 
pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.330907 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9x49\" (UniqueName: \"kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49\") pod \"nova-scheduler-0\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.340139 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m75qh\" (UniqueName: \"kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh\") pod \"dnsmasq-dns-5b74b5cfd5-p584x\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.563332 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.574191 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.843560 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:22:44 crc kubenswrapper[4893]: W0314 07:22:44.850647 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded84bb9f_6d66_4b40_8d90_2a2785d0ddd2.slice/crio-7c3b7c70024d3897f51a3aee732a87b7135b1b5652f8c7f39f2fff1394bf3499 WatchSource:0}: Error finding container 7c3b7c70024d3897f51a3aee732a87b7135b1b5652f8c7f39f2fff1394bf3499: Status 404 returned error can't find the container with id 7c3b7c70024d3897f51a3aee732a87b7135b1b5652f8c7f39f2fff1394bf3499 Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.947590 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vwvm"] Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.949427 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.952064 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.952096 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 07:22:44 crc kubenswrapper[4893]: I0314 07:22:44.964110 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vwvm"] Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.030643 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:22:45 crc kubenswrapper[4893]: W0314 07:22:45.038663 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777a623c_ec9d_4342_b3e6_4c88286184a7.slice/crio-ad2971fce39f05d0f2d1bd8a5b86d2bdfc53e137e5550e6e8e8e11cbbc434b53 WatchSource:0}: Error finding container ad2971fce39f05d0f2d1bd8a5b86d2bdfc53e137e5550e6e8e8e11cbbc434b53: Status 404 returned error can't find the container with id ad2971fce39f05d0f2d1bd8a5b86d2bdfc53e137e5550e6e8e8e11cbbc434b53 Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.054045 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.054804 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.055250 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hzz\" (UniqueName: \"kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.055660 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.154549 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k5lhh"] Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.158103 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.158143 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 
07:22:45.158182 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hzz\" (UniqueName: \"kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.158211 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.165280 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.170473 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.181571 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.186639 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: 
\"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.192283 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hzz\" (UniqueName: \"kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz\") pod \"nova-cell1-conductor-db-sync-5vwvm\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.206887 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:22:45 crc kubenswrapper[4893]: W0314 07:22:45.210775 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b51e9fd_b41e_4a2d_a792_b21cdb16a895.slice/crio-2812c536420517ddc3be9f39708aa14462650bb8f3e18b44cf2fbb6f880a5027 WatchSource:0}: Error finding container 2812c536420517ddc3be9f39708aa14462650bb8f3e18b44cf2fbb6f880a5027: Status 404 returned error can't find the container with id 2812c536420517ddc3be9f39708aa14462650bb8f3e18b44cf2fbb6f880a5027 Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.298755 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.414658 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.815126 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2","Type":"ContainerStarted","Data":"7c3b7c70024d3897f51a3aee732a87b7135b1b5652f8c7f39f2fff1394bf3499"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.819338 4893 generic.go:334] "Generic (PLEG): container finished" podID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerID="bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013" exitCode=0 Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.819470 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" event={"ID":"f1343ad6-1ec8-4249-826c-df0efc18fcb8","Type":"ContainerDied","Data":"bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.819533 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" event={"ID":"f1343ad6-1ec8-4249-826c-df0efc18fcb8","Type":"ContainerStarted","Data":"4b7bd6321dac21e164c9aa65c2c4f4148bf0e4e783d9ec2ceed62a2b8ce636c3"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.822177 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vwvm"] Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.825855 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b51e9fd-b41e-4a2d-a792-b21cdb16a895","Type":"ContainerStarted","Data":"2812c536420517ddc3be9f39708aa14462650bb8f3e18b44cf2fbb6f880a5027"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.840666 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerStarted","Data":"0a0538c7186fed0fc8f2e8dcc3a0485c473157eccddb3c5dd0d0627b415fcf59"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.846777 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerStarted","Data":"ad2971fce39f05d0f2d1bd8a5b86d2bdfc53e137e5550e6e8e8e11cbbc434b53"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.850011 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k5lhh" event={"ID":"438ee5c6-8f2a-491b-903d-78537b8465f4","Type":"ContainerStarted","Data":"1d21f42482ee6e162a138e9e63502673b14426d96eadf78b516d61fd03d33c75"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.850056 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k5lhh" event={"ID":"438ee5c6-8f2a-491b-903d-78537b8465f4","Type":"ContainerStarted","Data":"3c367d3c41189ac945dccea5a017d16ce188ec89f96d0beb035c6de7d065fa65"} Mar 14 07:22:45 crc kubenswrapper[4893]: I0314 07:22:45.879137 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-k5lhh" podStartSLOduration=2.879113185 podStartE2EDuration="2.879113185s" podCreationTimestamp="2026-03-14 07:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:45.872154736 +0000 UTC m=+1445.134331548" watchObservedRunningTime="2026-03-14 07:22:45.879113185 +0000 UTC m=+1445.141289977" Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.864910 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" 
event={"ID":"f1343ad6-1ec8-4249-826c-df0efc18fcb8","Type":"ContainerStarted","Data":"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e"} Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.865439 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.869912 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" event={"ID":"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a","Type":"ContainerStarted","Data":"bf00332d3fc74a103886075ae2cfbfdc2a8883a1876737cc65040af966c58cfc"} Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.869941 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" event={"ID":"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a","Type":"ContainerStarted","Data":"996d08b2604d52e591e785d01f6c412168ed75fc01667397c3acfb5347ff5a0d"} Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.897779 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" podStartSLOduration=2.897763998 podStartE2EDuration="2.897763998s" podCreationTimestamp="2026-03-14 07:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:46.893240078 +0000 UTC m=+1446.155416890" watchObservedRunningTime="2026-03-14 07:22:46.897763998 +0000 UTC m=+1446.159940790" Mar 14 07:22:46 crc kubenswrapper[4893]: I0314 07:22:46.921271 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" podStartSLOduration=2.921255835 podStartE2EDuration="2.921255835s" podCreationTimestamp="2026-03-14 07:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
07:22:46.916781717 +0000 UTC m=+1446.178958539" watchObservedRunningTime="2026-03-14 07:22:46.921255835 +0000 UTC m=+1446.183432627" Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.060717 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.071811 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.893869 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2","Type":"ContainerStarted","Data":"0d858d0941b63a264b577e23bcc03044e08fb0342547f46acf47f20765728aa7"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.894361 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0d858d0941b63a264b577e23bcc03044e08fb0342547f46acf47f20765728aa7" gracePeriod=30 Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.895467 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b51e9fd-b41e-4a2d-a792-b21cdb16a895","Type":"ContainerStarted","Data":"cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.901865 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerStarted","Data":"62265d015ff4962fcbaf716844c4ef386cf89bbbd13c2dace731c8743206848b"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.902077 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerStarted","Data":"4d7dd48993ccedf3d39ca9396519bdfa5e60db1086c1180d8829d4b46879e6ae"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.901955 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-metadata" containerID="cri-o://62265d015ff4962fcbaf716844c4ef386cf89bbbd13c2dace731c8743206848b" gracePeriod=30 Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.901934 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-log" containerID="cri-o://4d7dd48993ccedf3d39ca9396519bdfa5e60db1086c1180d8829d4b46879e6ae" gracePeriod=30 Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.904919 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerStarted","Data":"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.904993 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerStarted","Data":"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12"} Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.916967 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.459232668 podStartE2EDuration="5.916948625s" podCreationTimestamp="2026-03-14 07:22:43 +0000 UTC" firstStartedPulling="2026-03-14 07:22:44.852629243 +0000 UTC m=+1444.114806035" lastFinishedPulling="2026-03-14 07:22:48.31034519 +0000 UTC m=+1447.572521992" observedRunningTime="2026-03-14 07:22:48.910102429 +0000 UTC m=+1448.172279231" 
watchObservedRunningTime="2026-03-14 07:22:48.916948625 +0000 UTC m=+1448.179125417" Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.935643 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8069388589999997 podStartE2EDuration="5.935625596s" podCreationTimestamp="2026-03-14 07:22:43 +0000 UTC" firstStartedPulling="2026-03-14 07:22:45.234283946 +0000 UTC m=+1444.496460738" lastFinishedPulling="2026-03-14 07:22:48.362970683 +0000 UTC m=+1447.625147475" observedRunningTime="2026-03-14 07:22:48.927152391 +0000 UTC m=+1448.189329203" watchObservedRunningTime="2026-03-14 07:22:48.935625596 +0000 UTC m=+1448.197802378" Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.947167 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.793569995 podStartE2EDuration="5.947152305s" podCreationTimestamp="2026-03-14 07:22:43 +0000 UTC" firstStartedPulling="2026-03-14 07:22:45.197479815 +0000 UTC m=+1444.459656597" lastFinishedPulling="2026-03-14 07:22:48.351062115 +0000 UTC m=+1447.613238907" observedRunningTime="2026-03-14 07:22:48.945044234 +0000 UTC m=+1448.207221036" watchObservedRunningTime="2026-03-14 07:22:48.947152305 +0000 UTC m=+1448.209329097" Mar 14 07:22:48 crc kubenswrapper[4893]: I0314 07:22:48.965480 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.64038509 podStartE2EDuration="5.965460648s" podCreationTimestamp="2026-03-14 07:22:43 +0000 UTC" firstStartedPulling="2026-03-14 07:22:45.041270167 +0000 UTC m=+1444.303446959" lastFinishedPulling="2026-03-14 07:22:48.366345715 +0000 UTC m=+1447.628522517" observedRunningTime="2026-03-14 07:22:48.963650494 +0000 UTC m=+1448.225827296" watchObservedRunningTime="2026-03-14 07:22:48.965460648 +0000 UTC m=+1448.227637450" Mar 14 07:22:49 crc kubenswrapper[4893]: I0314 07:22:49.164499 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:22:49 crc kubenswrapper[4893]: I0314 07:22:49.564922 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 07:22:49 crc kubenswrapper[4893]: I0314 07:22:49.922802 4893 generic.go:334] "Generic (PLEG): container finished" podID="2336e116-3fed-45af-968b-301825aa44a6" containerID="4d7dd48993ccedf3d39ca9396519bdfa5e60db1086c1180d8829d4b46879e6ae" exitCode=143 Mar 14 07:22:49 crc kubenswrapper[4893]: I0314 07:22:49.922882 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerDied","Data":"4d7dd48993ccedf3d39ca9396519bdfa5e60db1086c1180d8829d4b46879e6ae"} Mar 14 07:22:52 crc kubenswrapper[4893]: I0314 07:22:52.954025 4893 generic.go:334] "Generic (PLEG): container finished" podID="438ee5c6-8f2a-491b-903d-78537b8465f4" containerID="1d21f42482ee6e162a138e9e63502673b14426d96eadf78b516d61fd03d33c75" exitCode=0 Mar 14 07:22:52 crc kubenswrapper[4893]: I0314 07:22:52.954098 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k5lhh" event={"ID":"438ee5c6-8f2a-491b-903d-78537b8465f4","Type":"ContainerDied","Data":"1d21f42482ee6e162a138e9e63502673b14426d96eadf78b516d61fd03d33c75"} Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.223290 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.223693 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.493612 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.531765 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwb2g\" (UniqueName: \"kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g\") pod \"438ee5c6-8f2a-491b-903d-78537b8465f4\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.531972 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle\") pod \"438ee5c6-8f2a-491b-903d-78537b8465f4\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.532038 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data\") pod \"438ee5c6-8f2a-491b-903d-78537b8465f4\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.532075 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts\") pod \"438ee5c6-8f2a-491b-903d-78537b8465f4\" (UID: \"438ee5c6-8f2a-491b-903d-78537b8465f4\") " Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.541724 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts" (OuterVolumeSpecName: "scripts") pod "438ee5c6-8f2a-491b-903d-78537b8465f4" (UID: "438ee5c6-8f2a-491b-903d-78537b8465f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.547166 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g" (OuterVolumeSpecName: "kube-api-access-vwb2g") pod "438ee5c6-8f2a-491b-903d-78537b8465f4" (UID: "438ee5c6-8f2a-491b-903d-78537b8465f4"). InnerVolumeSpecName "kube-api-access-vwb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.559951 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438ee5c6-8f2a-491b-903d-78537b8465f4" (UID: "438ee5c6-8f2a-491b-903d-78537b8465f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.561611 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data" (OuterVolumeSpecName: "config-data") pod "438ee5c6-8f2a-491b-903d-78537b8465f4" (UID: "438ee5c6-8f2a-491b-903d-78537b8465f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.564546 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.582390 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.596223 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.635640 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwb2g\" (UniqueName: \"kubernetes.io/projected/438ee5c6-8f2a-491b-903d-78537b8465f4-kube-api-access-vwb2g\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.635671 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.635682 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.635692 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438ee5c6-8f2a-491b-903d-78537b8465f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.653193 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.653548 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" 
podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="dnsmasq-dns" containerID="cri-o://3372b430ebf12e190f0b37cb7022f5ea88eeb59494e0e98c6812b07b2b2587eb" gracePeriod=10 Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.976387 4893 generic.go:334] "Generic (PLEG): container finished" podID="6efc151f-afa4-4275-af88-738c5c23c651" containerID="3372b430ebf12e190f0b37cb7022f5ea88eeb59494e0e98c6812b07b2b2587eb" exitCode=0 Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.976450 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" event={"ID":"6efc151f-afa4-4275-af88-738c5c23c651","Type":"ContainerDied","Data":"3372b430ebf12e190f0b37cb7022f5ea88eeb59494e0e98c6812b07b2b2587eb"} Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.982463 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k5lhh" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.982951 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k5lhh" event={"ID":"438ee5c6-8f2a-491b-903d-78537b8465f4","Type":"ContainerDied","Data":"3c367d3c41189ac945dccea5a017d16ce188ec89f96d0beb035c6de7d065fa65"} Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.982988 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c367d3c41189ac945dccea5a017d16ce188ec89f96d0beb035c6de7d065fa65" Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.984425 4893 generic.go:334] "Generic (PLEG): container finished" podID="f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" containerID="bf00332d3fc74a103886075ae2cfbfdc2a8883a1876737cc65040af966c58cfc" exitCode=0 Mar 14 07:22:54 crc kubenswrapper[4893]: I0314 07:22:54.985338 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" 
event={"ID":"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a","Type":"ContainerDied","Data":"bf00332d3fc74a103886075ae2cfbfdc2a8883a1876737cc65040af966c58cfc"} Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.028849 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.113848 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142512 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142567 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142695 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnl48\" (UniqueName: \"kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142829 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142868 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.142903 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.149045 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48" (OuterVolumeSpecName: "kube-api-access-jnl48") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "kube-api-access-jnl48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.187238 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.187533 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-log" containerID="cri-o://4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12" gracePeriod=30 Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.187723 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-api" containerID="cri-o://52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9" gracePeriod=30 Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.191854 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.191999 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.228800 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.233031 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.241821 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.243785 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config" (OuterVolumeSpecName: "config") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.244133 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") pod \"6efc151f-afa4-4275-af88-738c5c23c651\" (UID: \"6efc151f-afa4-4275-af88-738c5c23c651\") " Mar 14 07:22:55 crc kubenswrapper[4893]: W0314 07:22:55.244268 4893 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6efc151f-afa4-4275-af88-738c5c23c651/volumes/kubernetes.io~configmap/config Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.244283 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config" (OuterVolumeSpecName: "config") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.244807 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.244897 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnl48\" (UniqueName: \"kubernetes.io/projected/6efc151f-afa4-4275-af88-738c5c23c651-kube-api-access-jnl48\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.244961 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.245028 4893 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.245083 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.267341 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6efc151f-afa4-4275-af88-738c5c23c651" (UID: "6efc151f-afa4-4275-af88-738c5c23c651"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.348025 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efc151f-afa4-4275-af88-738c5c23c651-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.572533 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.995042 4893 generic.go:334] "Generic (PLEG): container finished" podID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerID="4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12" exitCode=143 Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.995130 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerDied","Data":"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12"} Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.997645 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.998077 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" event={"ID":"6efc151f-afa4-4275-af88-738c5c23c651","Type":"ContainerDied","Data":"d2a8a60accc816ad6f3d08f4de05c449df684739146aab2f6a80623cab884520"} Mar 14 07:22:55 crc kubenswrapper[4893]: I0314 07:22:55.998108 4893 scope.go:117] "RemoveContainer" containerID="3372b430ebf12e190f0b37cb7022f5ea88eeb59494e0e98c6812b07b2b2587eb" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.023027 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.035837 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-pkbmn"] Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.038923 4893 scope.go:117] "RemoveContainer" containerID="91afdf74937623f4834bf2f3dec17b193ce76742c7b8d5c2664e65f31c0e9c85" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.392628 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.501216 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts\") pod \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.501564 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hzz\" (UniqueName: \"kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz\") pod \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.501751 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data\") pod \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.501793 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle\") pod \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\" (UID: \"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a\") " Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.507545 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts" (OuterVolumeSpecName: "scripts") pod "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" (UID: "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.511031 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz" (OuterVolumeSpecName: "kube-api-access-78hzz") pod "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" (UID: "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a"). InnerVolumeSpecName "kube-api-access-78hzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.533163 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" (UID: "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.543170 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data" (OuterVolumeSpecName: "config-data") pod "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" (UID: "f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.604540 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.604582 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.604595 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:56 crc kubenswrapper[4893]: I0314 07:22:56.604606 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hzz\" (UniqueName: \"kubernetes.io/projected/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a-kube-api-access-78hzz\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.011333 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerName="nova-scheduler-scheduler" containerID="cri-o://cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" gracePeriod=30 Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.011597 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" event={"ID":"f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a","Type":"ContainerDied","Data":"996d08b2604d52e591e785d01f6c412168ed75fc01667397c3acfb5347ff5a0d"} Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.011693 4893 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="996d08b2604d52e591e785d01f6c412168ed75fc01667397c3acfb5347ff5a0d" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.011938 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vwvm" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.093389 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:22:57 crc kubenswrapper[4893]: E0314 07:22:57.094303 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438ee5c6-8f2a-491b-903d-78537b8465f4" containerName="nova-manage" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.094456 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="438ee5c6-8f2a-491b-903d-78537b8465f4" containerName="nova-manage" Mar 14 07:22:57 crc kubenswrapper[4893]: E0314 07:22:57.094626 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="init" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.094740 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="init" Mar 14 07:22:57 crc kubenswrapper[4893]: E0314 07:22:57.094846 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="dnsmasq-dns" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.094946 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="dnsmasq-dns" Mar 14 07:22:57 crc kubenswrapper[4893]: E0314 07:22:57.095069 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" containerName="nova-cell1-conductor-db-sync" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.095183 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" 
containerName="nova-cell1-conductor-db-sync" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.095680 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="dnsmasq-dns" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.095820 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="438ee5c6-8f2a-491b-903d-78537b8465f4" containerName="nova-manage" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.095938 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" containerName="nova-cell1-conductor-db-sync" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.097053 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.099085 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.127963 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.214964 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xhs\" (UniqueName: \"kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.215282 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 
crc kubenswrapper[4893]: I0314 07:22:57.215453 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.317115 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.317296 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xhs\" (UniqueName: \"kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.317381 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.330981 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.331393 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.349115 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xhs\" (UniqueName: \"kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs\") pod \"nova-cell1-conductor-0\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.396676 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6efc151f-afa4-4275-af88-738c5c23c651" path="/var/lib/kubelet/pods/6efc151f-afa4-4275-af88-738c5c23c651/volumes" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.415471 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:57 crc kubenswrapper[4893]: I0314 07:22:57.939673 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:22:58 crc kubenswrapper[4893]: I0314 07:22:58.025201 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0f57646-651c-4b8f-b73d-6606d06fa3a3","Type":"ContainerStarted","Data":"280b9a33b356361ff4e472b1ae89ba0321e49995373f4302c52cc974cfcc8c57"} Mar 14 07:22:59 crc kubenswrapper[4893]: I0314 07:22:59.038698 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0f57646-651c-4b8f-b73d-6606d06fa3a3","Type":"ContainerStarted","Data":"c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b"} Mar 14 07:22:59 crc kubenswrapper[4893]: I0314 07:22:59.040224 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 14 07:22:59 crc kubenswrapper[4893]: I0314 07:22:59.077752 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.077723989 podStartE2EDuration="2.077723989s" podCreationTimestamp="2026-03-14 07:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:59.069873418 +0000 UTC m=+1458.332050250" watchObservedRunningTime="2026-03-14 07:22:59.077723989 +0000 UTC m=+1458.339900821" Mar 14 07:22:59 crc kubenswrapper[4893]: E0314 07:22:59.566034 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:22:59 crc 
kubenswrapper[4893]: E0314 07:22:59.569783 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:22:59 crc kubenswrapper[4893]: E0314 07:22:59.571616 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:22:59 crc kubenswrapper[4893]: E0314 07:22:59.571713 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerName="nova-scheduler-scheduler" Mar 14 07:22:59 crc kubenswrapper[4893]: I0314 07:22:59.916328 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5547746bbf-pkbmn" podUID="6efc151f-afa4-4275-af88-738c5c23c651" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.050490 4893 generic.go:334] "Generic (PLEG): container finished" podID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerID="cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" exitCode=0 Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.051369 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"3b51e9fd-b41e-4a2d-a792-b21cdb16a895","Type":"ContainerDied","Data":"cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab"} Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.436765 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.601296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle\") pod \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.601422 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9x49\" (UniqueName: \"kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49\") pod \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.601599 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data\") pod \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\" (UID: \"3b51e9fd-b41e-4a2d-a792-b21cdb16a895\") " Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.627141 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49" (OuterVolumeSpecName: "kube-api-access-v9x49") pod "3b51e9fd-b41e-4a2d-a792-b21cdb16a895" (UID: "3b51e9fd-b41e-4a2d-a792-b21cdb16a895"). InnerVolumeSpecName "kube-api-access-v9x49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.631292 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data" (OuterVolumeSpecName: "config-data") pod "3b51e9fd-b41e-4a2d-a792-b21cdb16a895" (UID: "3b51e9fd-b41e-4a2d-a792-b21cdb16a895"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.636700 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b51e9fd-b41e-4a2d-a792-b21cdb16a895" (UID: "3b51e9fd-b41e-4a2d-a792-b21cdb16a895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.704984 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.705303 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9x49\" (UniqueName: \"kubernetes.io/projected/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-kube-api-access-v9x49\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.705561 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b51e9fd-b41e-4a2d-a792-b21cdb16a895-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:00 crc kubenswrapper[4893]: I0314 07:23:00.967233 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.058644 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.058641 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b51e9fd-b41e-4a2d-a792-b21cdb16a895","Type":"ContainerDied","Data":"2812c536420517ddc3be9f39708aa14462650bb8f3e18b44cf2fbb6f880a5027"} Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.058776 4893 scope.go:117] "RemoveContainer" containerID="cd9cb53c3b340e2b7ee180907fde7f77dc72c35ab027e7836dbd5dafe2c865ab" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.061114 4893 generic.go:334] "Generic (PLEG): container finished" podID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerID="52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9" exitCode=0 Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.061573 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerDied","Data":"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9"} Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.061597 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777a623c-ec9d-4342-b3e6-4c88286184a7","Type":"ContainerDied","Data":"ad2971fce39f05d0f2d1bd8a5b86d2bdfc53e137e5550e6e8e8e11cbbc434b53"} Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.061756 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.092150 4893 scope.go:117] "RemoveContainer" containerID="52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.092408 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.109619 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.110097 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs\") pod \"777a623c-ec9d-4342-b3e6-4c88286184a7\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.110179 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftj8h\" (UniqueName: \"kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h\") pod \"777a623c-ec9d-4342-b3e6-4c88286184a7\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.110215 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle\") pod \"777a623c-ec9d-4342-b3e6-4c88286184a7\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.110328 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data\") pod \"777a623c-ec9d-4342-b3e6-4c88286184a7\" (UID: \"777a623c-ec9d-4342-b3e6-4c88286184a7\") " Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 
07:23:01.110591 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs" (OuterVolumeSpecName: "logs") pod "777a623c-ec9d-4342-b3e6-4c88286184a7" (UID: "777a623c-ec9d-4342-b3e6-4c88286184a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.111693 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777a623c-ec9d-4342-b3e6-4c88286184a7-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.114720 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h" (OuterVolumeSpecName: "kube-api-access-ftj8h") pod "777a623c-ec9d-4342-b3e6-4c88286184a7" (UID: "777a623c-ec9d-4342-b3e6-4c88286184a7"). InnerVolumeSpecName "kube-api-access-ftj8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.136277 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: E0314 07:23:01.136746 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-log" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.136768 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-log" Mar 14 07:23:01 crc kubenswrapper[4893]: E0314 07:23:01.136795 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerName="nova-scheduler-scheduler" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.136803 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerName="nova-scheduler-scheduler" Mar 14 07:23:01 crc kubenswrapper[4893]: E0314 07:23:01.136821 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-api" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.136828 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-api" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.138653 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" containerName="nova-scheduler-scheduler" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.138696 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" containerName="nova-api-api" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.138710 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" 
containerName="nova-api-log" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.140511 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.143373 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.145804 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "777a623c-ec9d-4342-b3e6-4c88286184a7" (UID: "777a623c-ec9d-4342-b3e6-4c88286184a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.157977 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.159165 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data" (OuterVolumeSpecName: "config-data") pod "777a623c-ec9d-4342-b3e6-4c88286184a7" (UID: "777a623c-ec9d-4342-b3e6-4c88286184a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.214248 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftj8h\" (UniqueName: \"kubernetes.io/projected/777a623c-ec9d-4342-b3e6-4c88286184a7-kube-api-access-ftj8h\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.215920 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.215954 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777a623c-ec9d-4342-b3e6-4c88286184a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.227696 4893 scope.go:117] "RemoveContainer" containerID="4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.244744 4893 scope.go:117] "RemoveContainer" containerID="52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9" Mar 14 07:23:01 crc kubenswrapper[4893]: E0314 07:23:01.245111 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9\": container with ID starting with 52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9 not found: ID does not exist" containerID="52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.245164 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9"} err="failed to get container status 
\"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9\": rpc error: code = NotFound desc = could not find container \"52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9\": container with ID starting with 52b184e480c05637f492ebde9c93d95639ec40495b2845bb32049914d45db2e9 not found: ID does not exist" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.245189 4893 scope.go:117] "RemoveContainer" containerID="4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12" Mar 14 07:23:01 crc kubenswrapper[4893]: E0314 07:23:01.245513 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12\": container with ID starting with 4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12 not found: ID does not exist" containerID="4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.245559 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12"} err="failed to get container status \"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12\": rpc error: code = NotFound desc = could not find container \"4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12\": container with ID starting with 4a6b9f7c58db739df5aa6f68c55533cd7497f82f56367f7172a25847abbbdf12 not found: ID does not exist" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.317918 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 
07:23:01.318605 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.318686 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rgt\" (UniqueName: \"kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.400750 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b51e9fd-b41e-4a2d-a792-b21cdb16a895" path="/var/lib/kubelet/pods/3b51e9fd-b41e-4a2d-a792-b21cdb16a895/volumes" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.421310 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.426695 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rgt\" (UniqueName: \"kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.426898 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.428489 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.428628 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.432009 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.436221 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.447783 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.449796 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.450192 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rgt\" (UniqueName: \"kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt\") pod \"nova-scheduler-0\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.461955 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.473587 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.526661 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.527932 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.528073 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.528214 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swj5n\" (UniqueName: \"kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " 
pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.528347 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.630690 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.630758 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.630822 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swj5n\" (UniqueName: \"kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.630942 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.631581 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.634871 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.636072 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:01 crc kubenswrapper[4893]: I0314 07:23:01.678660 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swj5n\" (UniqueName: \"kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n\") pod \"nova-api-0\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " pod="openstack/nova-api-0" Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:01.813796 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:02.020166 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:02.072648 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bbd017e-988b-409e-861c-098bb3ab86ca","Type":"ContainerStarted","Data":"07ac464ecbf483021ef0be8dc7349796787f2d387226a01986484c3f32863e30"} Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:02.239334 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:02.239396 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:02 crc kubenswrapper[4893]: I0314 07:23:02.819801 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.088852 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerStarted","Data":"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd"} Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.089256 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerStarted","Data":"85100369830d64a14a0d03d6d73fd7dedf15485e7088c4d8d2cf3e4586783005"} Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.090826 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bbd017e-988b-409e-861c-098bb3ab86ca","Type":"ContainerStarted","Data":"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811"} Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.126711 4893 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.126688989 podStartE2EDuration="2.126688989s" podCreationTimestamp="2026-03-14 07:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:03.115893748 +0000 UTC m=+1462.378070560" watchObservedRunningTime="2026-03-14 07:23:03.126688989 +0000 UTC m=+1462.388865821" Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.305091 4893 scope.go:117] "RemoveContainer" containerID="f79609637c7322108aa2e766ff58838b6e7ab55adc2ab5d441df7f60abf1fb4f" Mar 14 07:23:03 crc kubenswrapper[4893]: I0314 07:23:03.408945 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777a623c-ec9d-4342-b3e6-4c88286184a7" path="/var/lib/kubelet/pods/777a623c-ec9d-4342-b3e6-4c88286184a7/volumes" Mar 14 07:23:04 crc kubenswrapper[4893]: I0314 07:23:04.102513 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerStarted","Data":"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c"} Mar 14 07:23:04 crc kubenswrapper[4893]: I0314 07:23:04.134169 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.134149421 podStartE2EDuration="3.134149421s" podCreationTimestamp="2026-03-14 07:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:04.126204198 +0000 UTC m=+1463.388381010" watchObservedRunningTime="2026-03-14 07:23:04.134149421 +0000 UTC m=+1463.396326213" Mar 14 07:23:05 crc kubenswrapper[4893]: I0314 07:23:05.391211 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 07:23:06 crc kubenswrapper[4893]: I0314 07:23:06.526772 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 07:23:07 crc kubenswrapper[4893]: I0314 07:23:07.467457 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.194103 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.194606 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" containerName="kube-state-metrics" containerID="cri-o://79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce" gracePeriod=30 Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.697699 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.794656 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxw86\" (UniqueName: \"kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86\") pod \"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc\" (UID: \"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc\") " Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.799433 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86" (OuterVolumeSpecName: "kube-api-access-qxw86") pod "cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" (UID: "cb3f59e3-5322-4aa3-94c7-de3cc01c39cc"). InnerVolumeSpecName "kube-api-access-qxw86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:09 crc kubenswrapper[4893]: I0314 07:23:09.896963 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxw86\" (UniqueName: \"kubernetes.io/projected/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc-kube-api-access-qxw86\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.185351 4893 generic.go:334] "Generic (PLEG): container finished" podID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" containerID="79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce" exitCode=2 Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.185440 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc","Type":"ContainerDied","Data":"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce"} Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.185453 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.185489 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb3f59e3-5322-4aa3-94c7-de3cc01c39cc","Type":"ContainerDied","Data":"8eeb92ae66581a8ddc0966be1f4628f6975f2e307c6746198108c7c0efca84f4"} Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.185505 4893 scope.go:117] "RemoveContainer" containerID="79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.214898 4893 scope.go:117] "RemoveContainer" containerID="79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce" Mar 14 07:23:10 crc kubenswrapper[4893]: E0314 07:23:10.215553 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce\": container with ID starting with 79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce not found: ID does not exist" containerID="79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.215588 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce"} err="failed to get container status \"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce\": rpc error: code = NotFound desc = could not find container \"79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce\": container with ID starting with 79bd7e0e66d8f0475a7827f0b449209a2d19414d3528e69c2515efbdf5c781ce not found: ID does not exist" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.232983 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 
07:23:10.249973 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.259651 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:10 crc kubenswrapper[4893]: E0314 07:23:10.261450 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" containerName="kube-state-metrics" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.261475 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" containerName="kube-state-metrics" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.261804 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" containerName="kube-state-metrics" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.266591 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.268602 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.269058 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.273881 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.405849 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc 
kubenswrapper[4893]: I0314 07:23:10.405931 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.406392 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkvb\" (UniqueName: \"kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.406798 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.509111 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.509621 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" 
Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.509727 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.509906 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkvb\" (UniqueName: \"kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.513636 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.514501 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.515957 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.533709 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkvb\" (UniqueName: \"kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb\") pod \"kube-state-metrics-0\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.592377 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.985457 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.986830 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-central-agent" containerID="cri-o://950611c56e3344fdc62264065d2587e5a1938230c6418d51d23815de1934eaea" gracePeriod=30 Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.986965 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="proxy-httpd" containerID="cri-o://3b4dcfde76f399ad63952470692ea7797b03fc5391b57f10b894e2301aa465e2" gracePeriod=30 Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.987064 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-notification-agent" containerID="cri-o://d5aab96a43fad3ce075801aad0ba40c5256b90c39be16a195a7a51e462662ae4" gracePeriod=30 Mar 14 07:23:10 crc kubenswrapper[4893]: I0314 07:23:10.987302 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="sg-core" 
containerID="cri-o://7a443e5103fbd27a22cea9d965cf65d3ae0ac1bbfe32f671c5f3f620ad3f8284" gracePeriod=30 Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.060952 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.200128 4893 generic.go:334] "Generic (PLEG): container finished" podID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerID="3b4dcfde76f399ad63952470692ea7797b03fc5391b57f10b894e2301aa465e2" exitCode=0 Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.200185 4893 generic.go:334] "Generic (PLEG): container finished" podID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerID="7a443e5103fbd27a22cea9d965cf65d3ae0ac1bbfe32f671c5f3f620ad3f8284" exitCode=2 Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.200420 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerDied","Data":"3b4dcfde76f399ad63952470692ea7797b03fc5391b57f10b894e2301aa465e2"} Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.200471 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerDied","Data":"7a443e5103fbd27a22cea9d965cf65d3ae0ac1bbfe32f671c5f3f620ad3f8284"} Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.201570 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1cad381a-c3bf-4fc8-a314-6f45028f3482","Type":"ContainerStarted","Data":"4b6ebdcad9604ec201a9052264933fd32ee7e23b9e9bcc41a262190418c54706"} Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.391176 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3f59e3-5322-4aa3-94c7-de3cc01c39cc" path="/var/lib/kubelet/pods/cb3f59e3-5322-4aa3-94c7-de3cc01c39cc/volumes" Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.527572 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.561218 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.814607 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:11 crc kubenswrapper[4893]: I0314 07:23:11.814669 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.213809 4893 generic.go:334] "Generic (PLEG): container finished" podID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerID="950611c56e3344fdc62264065d2587e5a1938230c6418d51d23815de1934eaea" exitCode=0 Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.213888 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerDied","Data":"950611c56e3344fdc62264065d2587e5a1938230c6418d51d23815de1934eaea"} Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.217277 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1cad381a-c3bf-4fc8-a314-6f45028f3482","Type":"ContainerStarted","Data":"1479c505fa63564b68dc67398eb06f224838c90b598b1f33bb9bfdfc5ff3333d"} Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.217347 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.241797 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8708444530000001 podStartE2EDuration="2.241768726s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="2026-03-14 07:23:11.072709755 +0000 UTC 
m=+1470.334886537" lastFinishedPulling="2026-03-14 07:23:11.443634018 +0000 UTC m=+1470.705810810" observedRunningTime="2026-03-14 07:23:12.233857874 +0000 UTC m=+1471.496034686" watchObservedRunningTime="2026-03-14 07:23:12.241768726 +0000 UTC m=+1471.503945538" Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.248429 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.896766 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:12 crc kubenswrapper[4893]: I0314 07:23:12.896805 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.242362 4893 generic.go:334] "Generic (PLEG): container finished" podID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerID="d5aab96a43fad3ce075801aad0ba40c5256b90c39be16a195a7a51e462662ae4" exitCode=0 Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.242718 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerDied","Data":"d5aab96a43fad3ce075801aad0ba40c5256b90c39be16a195a7a51e462662ae4"} Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.242747 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1f8dfb58-4c36-4f9f-b142-29a325099f3e","Type":"ContainerDied","Data":"a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5"} Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.242761 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a01f2492a7de0996b71785e6edf868e0a74744b84c9b8c845e954ff7d27725d5" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.259353 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389646 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389695 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlgtv\" (UniqueName: \"kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389756 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389838 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: 
I0314 07:23:14.389905 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389938 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.389998 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd\") pod \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\" (UID: \"1f8dfb58-4c36-4f9f-b142-29a325099f3e\") " Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.391033 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.391273 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.395209 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts" (OuterVolumeSpecName: "scripts") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.396444 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv" (OuterVolumeSpecName: "kube-api-access-zlgtv") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "kube-api-access-zlgtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.432879 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.462288 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.489118 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data" (OuterVolumeSpecName: "config-data") pod "1f8dfb58-4c36-4f9f-b142-29a325099f3e" (UID: "1f8dfb58-4c36-4f9f-b142-29a325099f3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494079 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494121 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlgtv\" (UniqueName: \"kubernetes.io/projected/1f8dfb58-4c36-4f9f-b142-29a325099f3e-kube-api-access-zlgtv\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494157 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494240 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494257 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494274 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1f8dfb58-4c36-4f9f-b142-29a325099f3e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:14 crc kubenswrapper[4893]: I0314 07:23:14.494293 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f8dfb58-4c36-4f9f-b142-29a325099f3e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.253932 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.292861 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.301850 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.334088 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:15 crc kubenswrapper[4893]: E0314 07:23:15.334884 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-notification-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.334924 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-notification-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: E0314 07:23:15.334953 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-central-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.334969 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-central-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: E0314 07:23:15.335053 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" 
containerName="proxy-httpd" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335070 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="proxy-httpd" Mar 14 07:23:15 crc kubenswrapper[4893]: E0314 07:23:15.335096 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="sg-core" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335109 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="sg-core" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335432 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="proxy-httpd" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335473 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="sg-core" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335503 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-central-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.335554 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" containerName="ceilometer-notification-agent" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.338698 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.341813 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.344689 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.345436 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.345743 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.387490 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8dfb58-4c36-4f9f-b142-29a325099f3e" path="/var/lib/kubelet/pods/1f8dfb58-4c36-4f9f-b142-29a325099f3e/volumes" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.514326 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.514706 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg692\" (UniqueName: \"kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.514823 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.515073 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.515124 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.515185 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.515270 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.515310 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0" Mar 14 07:23:15 crc kubenswrapper[4893]: 
I0314 07:23:15.617781 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.617831 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.617870 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.617904 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.617928 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.617965 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.618050 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg692\" (UniqueName: \"kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.618084 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.619831 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.619931 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.624076 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.624573 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.624787 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.632925 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.634123 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.635379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg692\" (UniqueName: \"kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692\") pod \"ceilometer-0\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " pod="openstack/ceilometer-0"
Mar 14 07:23:15 crc kubenswrapper[4893]: I0314 07:23:15.660623 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:16 crc kubenswrapper[4893]: W0314 07:23:16.081689 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97b4a0c_e753_44fc_ba00_12c82648c30a.slice/crio-bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd WatchSource:0}: Error finding container bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd: Status 404 returned error can't find the container with id bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd
Mar 14 07:23:16 crc kubenswrapper[4893]: I0314 07:23:16.082174 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:16 crc kubenswrapper[4893]: I0314 07:23:16.263065 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerStarted","Data":"bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd"}
Mar 14 07:23:17 crc kubenswrapper[4893]: I0314 07:23:17.296158 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerStarted","Data":"f93c2e6852c1f30ddcb040cabab75f3dbad059392845c126c070852b93a48323"}
Mar 14 07:23:18 crc kubenswrapper[4893]: I0314 07:23:18.305376 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerStarted","Data":"212a975ff9e1cacce9d3e6283aa65866d8f8e34990736ef7141b9b6c2cfaca29"}
Mar 14 07:23:18 crc kubenswrapper[4893]: I0314 07:23:18.305718 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerStarted","Data":"7facda4f79e49359930910e74711a0b4e237b3502da367234832f322f7e87812"}
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.323360 4893 generic.go:334] "Generic (PLEG): container finished" podID="2336e116-3fed-45af-968b-301825aa44a6" containerID="62265d015ff4962fcbaf716844c4ef386cf89bbbd13c2dace731c8743206848b" exitCode=137
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.323634 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerDied","Data":"62265d015ff4962fcbaf716844c4ef386cf89bbbd13c2dace731c8743206848b"}
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.324953 4893 generic.go:334] "Generic (PLEG): container finished" podID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" containerID="0d858d0941b63a264b577e23bcc03044e08fb0342547f46acf47f20765728aa7" exitCode=137
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.324975 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2","Type":"ContainerDied","Data":"0d858d0941b63a264b577e23bcc03044e08fb0342547f46acf47f20765728aa7"}
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.417809 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.422582 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.500126 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data\") pod \"2336e116-3fed-45af-968b-301825aa44a6\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.500408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle\") pod \"2336e116-3fed-45af-968b-301825aa44a6\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.500445 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddwc6\" (UniqueName: \"kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6\") pod \"2336e116-3fed-45af-968b-301825aa44a6\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.500974 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs\") pod \"2336e116-3fed-45af-968b-301825aa44a6\" (UID: \"2336e116-3fed-45af-968b-301825aa44a6\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.501018 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmght\" (UniqueName: \"kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght\") pod \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.501084 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle\") pod \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.501178 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data\") pod \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\" (UID: \"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2\") "
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.501487 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs" (OuterVolumeSpecName: "logs") pod "2336e116-3fed-45af-968b-301825aa44a6" (UID: "2336e116-3fed-45af-968b-301825aa44a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.501654 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2336e116-3fed-45af-968b-301825aa44a6-logs\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.508323 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght" (OuterVolumeSpecName: "kube-api-access-vmght") pod "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" (UID: "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2"). InnerVolumeSpecName "kube-api-access-vmght". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.508477 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6" (OuterVolumeSpecName: "kube-api-access-ddwc6") pod "2336e116-3fed-45af-968b-301825aa44a6" (UID: "2336e116-3fed-45af-968b-301825aa44a6"). InnerVolumeSpecName "kube-api-access-ddwc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.530727 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data" (OuterVolumeSpecName: "config-data") pod "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" (UID: "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.531633 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" (UID: "ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.534321 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2336e116-3fed-45af-968b-301825aa44a6" (UID: "2336e116-3fed-45af-968b-301825aa44a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.561493 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data" (OuterVolumeSpecName: "config-data") pod "2336e116-3fed-45af-968b-301825aa44a6" (UID: "2336e116-3fed-45af-968b-301825aa44a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.603959 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.603997 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.604012 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336e116-3fed-45af-968b-301825aa44a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.604028 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddwc6\" (UniqueName: \"kubernetes.io/projected/2336e116-3fed-45af-968b-301825aa44a6-kube-api-access-ddwc6\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.604042 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmght\" (UniqueName: \"kubernetes.io/projected/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-kube-api-access-vmght\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.604056 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.814041 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 14 07:23:19 crc kubenswrapper[4893]: I0314 07:23:19.814083 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.340388 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerStarted","Data":"073b9c184ce9046866e4943bc04f1b040fc75258df01b2366699930f65b01981"}
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.340851 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.343567 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2","Type":"ContainerDied","Data":"7c3b7c70024d3897f51a3aee732a87b7135b1b5652f8c7f39f2fff1394bf3499"}
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.343637 4893 scope.go:117] "RemoveContainer" containerID="0d858d0941b63a264b577e23bcc03044e08fb0342547f46acf47f20765728aa7"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.344542 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.354735 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2336e116-3fed-45af-968b-301825aa44a6","Type":"ContainerDied","Data":"0a0538c7186fed0fc8f2e8dcc3a0485c473157eccddb3c5dd0d0627b415fcf59"}
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.354780 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.371039 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.943143219 podStartE2EDuration="5.371009364s" podCreationTimestamp="2026-03-14 07:23:15 +0000 UTC" firstStartedPulling="2026-03-14 07:23:16.083924783 +0000 UTC m=+1475.346101575" lastFinishedPulling="2026-03-14 07:23:19.511790928 +0000 UTC m=+1478.773967720" observedRunningTime="2026-03-14 07:23:20.36713776 +0000 UTC m=+1479.629314552" watchObservedRunningTime="2026-03-14 07:23:20.371009364 +0000 UTC m=+1479.633186156"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.403517 4893 scope.go:117] "RemoveContainer" containerID="62265d015ff4962fcbaf716844c4ef386cf89bbbd13c2dace731c8743206848b"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.411610 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.437552 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.447609 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.452014 4893 scope.go:117] "RemoveContainer" containerID="4d7dd48993ccedf3d39ca9396519bdfa5e60db1086c1180d8829d4b46879e6ae"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.465716 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.468294 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: E0314 07:23:20.468884 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.468909 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 07:23:20 crc kubenswrapper[4893]: E0314 07:23:20.468936 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-metadata"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.468945 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-metadata"
Mar 14 07:23:20 crc kubenswrapper[4893]: E0314 07:23:20.468966 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-log"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.468975 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-log"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.469219 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-log"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.469259 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2336e116-3fed-45af-968b-301825aa44a6" containerName="nova-metadata-metadata"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.469275 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.470471 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.479484 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.480869 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.484308 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.484354 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.484615 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.484728 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.484826 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.487545 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.496135 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.603067 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.628772 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.628827 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jm94\" (UniqueName: \"kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.628858 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.628933 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629028 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629113 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8cxh\" (UniqueName: \"kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629220 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629308 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629382 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.629573 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730605 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730657 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jm94\" (UniqueName: \"kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730685 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730704 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730744 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730781 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8cxh\" (UniqueName: \"kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730819 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730854 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730881 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.730940 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.733943 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.736192 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.736236 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.736292 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.737498 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.739233 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.745745 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.746128 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jm94\" (UniqueName: \"kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94\") pod \"nova-metadata-0\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.746610 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8cxh\" (UniqueName: \"kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.746937 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.802614 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:20 crc kubenswrapper[4893]: I0314 07:23:20.809807 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.282635 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.297023 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.397625 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2336e116-3fed-45af-968b-301825aa44a6" path="/var/lib/kubelet/pods/2336e116-3fed-45af-968b-301825aa44a6/volumes"
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.398176 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2" path="/var/lib/kubelet/pods/ed84bb9f-6d66-4b40-8d90-2a2785d0ddd2/volumes"
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.398651 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerStarted","Data":"be20cb920ae2eb8c6359ac73e2e243eb4089bce3f398bc0ee44624b444c23810"}
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.398669 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12b3c392-b02f-435f-8a96-04ad97890449","Type":"ContainerStarted","Data":"15eb4af16117dcb4aace993e70f4192634171d7f8aa9f27dbd85e3a66bc48a50"}
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.819414 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.823002 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 14 07:23:21 crc kubenswrapper[4893]: I0314 07:23:21.826854 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.408695 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerStarted","Data":"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a"}
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.409013 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerStarted","Data":"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e"}
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.410352 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12b3c392-b02f-435f-8a96-04ad97890449","Type":"ContainerStarted","Data":"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9"}
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.420037 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.434395 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.434377079 podStartE2EDuration="2.434377079s" podCreationTimestamp="2026-03-14 07:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:22.432290199 +0000 UTC m=+1481.694467021" watchObservedRunningTime="2026-03-14 07:23:22.434377079 +0000 UTC m=+1481.696553871"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.454911 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.454889086 podStartE2EDuration="2.454889086s" podCreationTimestamp="2026-03-14 07:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:22.452231062 +0000 UTC m=+1481.714407884" watchObservedRunningTime="2026-03-14 07:23:22.454889086 +0000 UTC m=+1481.717065898"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.649207 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"]
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.650968 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-7qsrf"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.659476 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"]
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.784448 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlcj\" (UniqueName: \"kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.784847 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.784920 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf"
Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.784956 4893
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.785248 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.785362 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887163 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887255 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887306 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887340 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887382 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.887504 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlcj\" (UniqueName: \"kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.888738 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.889412 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.890034 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.890662 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.891276 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.928538 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlcj\" (UniqueName: \"kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj\") pod \"dnsmasq-dns-69ffc749-7qsrf\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:22 crc kubenswrapper[4893]: I0314 07:23:22.976433 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:23 crc kubenswrapper[4893]: I0314 07:23:23.468635 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"] Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.436809 4893 generic.go:334] "Generic (PLEG): container finished" podID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerID="8a23c67b80013882f7c50d0e0b6c68e638069643a08b3c1cf1fe8d7791b8d9ab" exitCode=0 Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.437092 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" event={"ID":"5241cf43-f60b-4499-ae07-6b449f6ef57e","Type":"ContainerDied","Data":"8a23c67b80013882f7c50d0e0b6c68e638069643a08b3c1cf1fe8d7791b8d9ab"} Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.437128 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" event={"ID":"5241cf43-f60b-4499-ae07-6b449f6ef57e","Type":"ContainerStarted","Data":"1d96da80206a893b073d69a3fa67a054b52bdf5bab173b3ca52d2ed565c61cd3"} Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.683839 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.684390 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-central-agent" containerID="cri-o://f93c2e6852c1f30ddcb040cabab75f3dbad059392845c126c070852b93a48323" gracePeriod=30 Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.684468 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="sg-core" containerID="cri-o://212a975ff9e1cacce9d3e6283aa65866d8f8e34990736ef7141b9b6c2cfaca29" gracePeriod=30 Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 
07:23:24.684468 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="proxy-httpd" containerID="cri-o://073b9c184ce9046866e4943bc04f1b040fc75258df01b2366699930f65b01981" gracePeriod=30 Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.684492 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-notification-agent" containerID="cri-o://7facda4f79e49359930910e74711a0b4e237b3502da367234832f322f7e87812" gracePeriod=30 Mar 14 07:23:24 crc kubenswrapper[4893]: I0314 07:23:24.951118 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.447891 4893 generic.go:334] "Generic (PLEG): container finished" podID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerID="073b9c184ce9046866e4943bc04f1b040fc75258df01b2366699930f65b01981" exitCode=0 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448228 4893 generic.go:334] "Generic (PLEG): container finished" podID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerID="212a975ff9e1cacce9d3e6283aa65866d8f8e34990736ef7141b9b6c2cfaca29" exitCode=2 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448245 4893 generic.go:334] "Generic (PLEG): container finished" podID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerID="7facda4f79e49359930910e74711a0b4e237b3502da367234832f322f7e87812" exitCode=0 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448258 4893 generic.go:334] "Generic (PLEG): container finished" podID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerID="f93c2e6852c1f30ddcb040cabab75f3dbad059392845c126c070852b93a48323" exitCode=0 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.447926 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerDied","Data":"073b9c184ce9046866e4943bc04f1b040fc75258df01b2366699930f65b01981"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448348 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerDied","Data":"212a975ff9e1cacce9d3e6283aa65866d8f8e34990736ef7141b9b6c2cfaca29"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448372 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerDied","Data":"7facda4f79e49359930910e74711a0b4e237b3502da367234832f322f7e87812"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448390 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerDied","Data":"f93c2e6852c1f30ddcb040cabab75f3dbad059392845c126c070852b93a48323"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448405 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a97b4a0c-e753-44fc-ba00-12c82648c30a","Type":"ContainerDied","Data":"bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.448416 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb91a8390830b0c618d5e8a389922817dba3f5ce79916f2036e02da4b6bce2fd" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.450436 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" event={"ID":"5241cf43-f60b-4499-ae07-6b449f6ef57e","Type":"ContainerStarted","Data":"acd718ecc01f85e0d44e4851299d2daef071eb198233eb55a80f4008d7f7cece"} Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.450689 4893 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-api" containerID="cri-o://62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c" gracePeriod=30 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.450899 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-log" containerID="cri-o://85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd" gracePeriod=30 Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.475075 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" podStartSLOduration=3.475049878 podStartE2EDuration="3.475049878s" podCreationTimestamp="2026-03-14 07:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:25.47179528 +0000 UTC m=+1484.733972092" watchObservedRunningTime="2026-03-14 07:23:25.475049878 +0000 UTC m=+1484.737226670" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.552396 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.634875 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.634922 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg692\" (UniqueName: \"kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.634995 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.635047 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.635068 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.635128 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.635175 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.635285 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.637180 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.637515 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.641500 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts" (OuterVolumeSpecName: "scripts") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.641773 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692" (OuterVolumeSpecName: "kube-api-access-pg692") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "kube-api-access-pg692". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.673908 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.692802 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740571 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740629 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg692\" (UniqueName: \"kubernetes.io/projected/a97b4a0c-e753-44fc-ba00-12c82648c30a-kube-api-access-pg692\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740649 4893 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740664 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a97b4a0c-e753-44fc-ba00-12c82648c30a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740678 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.740692 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:25 crc kubenswrapper[4893]: E0314 07:23:25.743919 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data podName:a97b4a0c-e753-44fc-ba00-12c82648c30a nodeName:}" failed. 
No retries permitted until 2026-03-14 07:23:26.243883252 +0000 UTC m=+1485.506060104 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a") : error deleting /var/lib/kubelet/pods/a97b4a0c-e753-44fc-ba00-12c82648c30a/volume-subpaths: remove /var/lib/kubelet/pods/a97b4a0c-e753-44fc-ba00-12c82648c30a/volume-subpaths: no such file or directory Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.746727 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.810179 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:25 crc kubenswrapper[4893]: I0314 07:23:25.842910 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.251437 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") pod \"a97b4a0c-e753-44fc-ba00-12c82648c30a\" (UID: \"a97b4a0c-e753-44fc-ba00-12c82648c30a\") " Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.255774 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data" (OuterVolumeSpecName: "config-data") pod "a97b4a0c-e753-44fc-ba00-12c82648c30a" (UID: "a97b4a0c-e753-44fc-ba00-12c82648c30a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.354092 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97b4a0c-e753-44fc-ba00-12c82648c30a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.464847 4893 generic.go:334] "Generic (PLEG): container finished" podID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerID="85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd" exitCode=143 Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.464936 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.464931 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerDied","Data":"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd"} Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.467119 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.512670 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.531654 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.550381 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:26 crc kubenswrapper[4893]: E0314 07:23:26.550942 4893 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-notification-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.550970 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-notification-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: E0314 07:23:26.550997 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="sg-core" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551006 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="sg-core" Mar 14 07:23:26 crc kubenswrapper[4893]: E0314 07:23:26.551031 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="proxy-httpd" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551043 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="proxy-httpd" Mar 14 07:23:26 crc kubenswrapper[4893]: E0314 07:23:26.551057 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-central-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551064 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-central-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551287 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-central-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551314 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="ceilometer-notification-agent" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551350 4893 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="sg-core" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.551370 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" containerName="proxy-httpd" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.553603 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.556300 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.556450 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.556482 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.560561 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.599031 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:26 crc kubenswrapper[4893]: E0314 07:23:26.599762 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-59dqs log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-59dqs log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="d80346c2-8721-4c5b-8848-2c44d1b8d2e5" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.659880 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.659959 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.659980 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.660212 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.660248 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59dqs\" (UniqueName: \"kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.660391 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.660739 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.660985 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763557 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763644 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763710 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763731 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763809 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763842 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59dqs\" (UniqueName: \"kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763866 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.763903 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.764586 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.764634 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.767837 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.767993 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.768438 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.769785 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.780430 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[4893]: I0314 07:23:26.794807 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59dqs\" (UniqueName: \"kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs\") pod \"ceilometer-0\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " pod="openstack/ceilometer-0" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.395984 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97b4a0c-e753-44fc-ba00-12c82648c30a" path="/var/lib/kubelet/pods/a97b4a0c-e753-44fc-ba00-12c82648c30a/volumes" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.476916 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.494633 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.578377 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.578882 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579092 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579323 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579456 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579689 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59dqs\" (UniqueName: \"kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579848 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.580019 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.579919 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.580369 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs\") pod \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\" (UID: \"d80346c2-8721-4c5b-8848-2c44d1b8d2e5\") " Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.581320 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.581460 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.584828 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.586181 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs" (OuterVolumeSpecName: "kube-api-access-59dqs") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "kube-api-access-59dqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.587304 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts" (OuterVolumeSpecName: "scripts") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.587368 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.588261 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data" (OuterVolumeSpecName: "config-data") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.593716 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d80346c2-8721-4c5b-8848-2c44d1b8d2e5" (UID: "d80346c2-8721-4c5b-8848-2c44d1b8d2e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.684995 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.685078 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.685094 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59dqs\" (UniqueName: \"kubernetes.io/projected/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-kube-api-access-59dqs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.685112 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.685126 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[4893]: I0314 07:23:27.685138 4893 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80346c2-8721-4c5b-8848-2c44d1b8d2e5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.490479 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.576683 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.588358 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.605644 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.607906 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.610162 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.610945 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.611985 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.619269 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.708072 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.708477 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data\") pod \"ceilometer-0\" (UID: 
\"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.708513 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.708928 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bx6\" (UniqueName: \"kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.709022 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.709064 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.709137 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 
07:23:28.709211 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810475 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bx6\" (UniqueName: \"kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810554 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810576 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810602 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810630 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810684 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810709 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.810725 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.818348 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.819033 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.822258 4893 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.822591 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.825040 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.827010 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.830919 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.839032 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bx6\" (UniqueName: \"kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6\") pod \"ceilometer-0\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " 
pod="openstack/ceilometer-0" Mar 14 07:23:28 crc kubenswrapper[4893]: I0314 07:23:28.935703 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.044744 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.116054 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs\") pod \"2b55f3b4-1214-471c-a15c-6364e97d0818\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.116761 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle\") pod \"2b55f3b4-1214-471c-a15c-6364e97d0818\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.117042 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swj5n\" (UniqueName: \"kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n\") pod \"2b55f3b4-1214-471c-a15c-6364e97d0818\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.117070 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data\") pod \"2b55f3b4-1214-471c-a15c-6364e97d0818\" (UID: \"2b55f3b4-1214-471c-a15c-6364e97d0818\") " Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.116997 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs" 
(OuterVolumeSpecName: "logs") pod "2b55f3b4-1214-471c-a15c-6364e97d0818" (UID: "2b55f3b4-1214-471c-a15c-6364e97d0818"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.118186 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b55f3b4-1214-471c-a15c-6364e97d0818-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.124733 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n" (OuterVolumeSpecName: "kube-api-access-swj5n") pod "2b55f3b4-1214-471c-a15c-6364e97d0818" (UID: "2b55f3b4-1214-471c-a15c-6364e97d0818"). InnerVolumeSpecName "kube-api-access-swj5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.143562 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data" (OuterVolumeSpecName: "config-data") pod "2b55f3b4-1214-471c-a15c-6364e97d0818" (UID: "2b55f3b4-1214-471c-a15c-6364e97d0818"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.150150 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b55f3b4-1214-471c-a15c-6364e97d0818" (UID: "2b55f3b4-1214-471c-a15c-6364e97d0818"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.220715 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swj5n\" (UniqueName: \"kubernetes.io/projected/2b55f3b4-1214-471c-a15c-6364e97d0818-kube-api-access-swj5n\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.220777 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.220789 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b55f3b4-1214-471c-a15c-6364e97d0818-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.390292 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80346c2-8721-4c5b-8848-2c44d1b8d2e5" path="/var/lib/kubelet/pods/d80346c2-8721-4c5b-8848-2c44d1b8d2e5/volumes" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.400881 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:29 crc kubenswrapper[4893]: W0314 07:23:29.403486 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7c1963_417f_453f_8983_1c03d349f76d.slice/crio-3050af06d35ad99ac22b01a1de64fe78ff0ebebb248fd8993018c3814bc273cc WatchSource:0}: Error finding container 3050af06d35ad99ac22b01a1de64fe78ff0ebebb248fd8993018c3814bc273cc: Status 404 returned error can't find the container with id 3050af06d35ad99ac22b01a1de64fe78ff0ebebb248fd8993018c3814bc273cc Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.502562 4893 generic.go:334] "Generic (PLEG): container finished" podID="2b55f3b4-1214-471c-a15c-6364e97d0818" 
containerID="62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c" exitCode=0 Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.502666 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerDied","Data":"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c"} Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.502682 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.502713 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b55f3b4-1214-471c-a15c-6364e97d0818","Type":"ContainerDied","Data":"85100369830d64a14a0d03d6d73fd7dedf15485e7088c4d8d2cf3e4586783005"} Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.502736 4893 scope.go:117] "RemoveContainer" containerID="62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.503920 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerStarted","Data":"3050af06d35ad99ac22b01a1de64fe78ff0ebebb248fd8993018c3814bc273cc"} Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.525333 4893 scope.go:117] "RemoveContainer" containerID="85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.535571 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.548539 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.559953 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:29 crc kubenswrapper[4893]: E0314 
07:23:29.560368 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-api" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.560884 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-api" Mar 14 07:23:29 crc kubenswrapper[4893]: E0314 07:23:29.560908 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-log" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.560916 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-log" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.561119 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-api" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.561137 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" containerName="nova-api-log" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.562004 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.563706 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.563921 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.570266 4893 scope.go:117] "RemoveContainer" containerID="62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.570646 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 07:23:29 crc kubenswrapper[4893]: E0314 07:23:29.575571 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c\": container with ID starting with 62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c not found: ID does not exist" containerID="62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.575635 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c"} err="failed to get container status \"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c\": rpc error: code = NotFound desc = could not find container \"62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c\": container with ID starting with 62be3b20c75ccf945fc1b6ffcd40a952892e1ad548e1d8f510579789c375347c not found: ID does not exist" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.575660 4893 scope.go:117] "RemoveContainer" containerID="85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd" Mar 14 07:23:29 crc 
kubenswrapper[4893]: E0314 07:23:29.578387 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd\": container with ID starting with 85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd not found: ID does not exist" containerID="85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.578422 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd"} err="failed to get container status \"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd\": rpc error: code = NotFound desc = could not find container \"85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd\": container with ID starting with 85823d86f7f607494f364ff61b9ba224eb4ff5e8b1dddee81c142042cdf0d7fd not found: ID does not exist" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.582292 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.627236 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv9rn\" (UniqueName: \"kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.627276 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 
07:23:29.627331 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.627403 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.627453 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.627471 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729283 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729479 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729584 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729626 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729765 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9rn\" (UniqueName: \"kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.729811 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.730006 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.734894 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.737027 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.737462 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.737740 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.755563 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9rn\" (UniqueName: \"kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn\") pod \"nova-api-0\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " pod="openstack/nova-api-0" Mar 14 07:23:29 crc kubenswrapper[4893]: I0314 07:23:29.889079 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.319414 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:30 crc kubenswrapper[4893]: W0314 07:23:30.325099 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91c66f1_34a5_40f1_a3b7_c303e2c0a58b.slice/crio-0610b496775e2065245b50b0c39415e9c04688c2ad24222d1b83f7a20eac4287 WatchSource:0}: Error finding container 0610b496775e2065245b50b0c39415e9c04688c2ad24222d1b83f7a20eac4287: Status 404 returned error can't find the container with id 0610b496775e2065245b50b0c39415e9c04688c2ad24222d1b83f7a20eac4287 Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.516142 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerStarted","Data":"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5"} Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.517360 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerStarted","Data":"0610b496775e2065245b50b0c39415e9c04688c2ad24222d1b83f7a20eac4287"} Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.519053 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerStarted","Data":"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"} Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.803203 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.803600 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:23:30 crc 
kubenswrapper[4893]: I0314 07:23:30.810087 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:30 crc kubenswrapper[4893]: I0314 07:23:30.829913 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.409893 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b55f3b4-1214-471c-a15c-6364e97d0818" path="/var/lib/kubelet/pods/2b55f3b4-1214-471c-a15c-6364e97d0818/volumes" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.456045 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.533629 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerStarted","Data":"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"} Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.534424 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerStarted","Data":"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"} Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.541279 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerStarted","Data":"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228"} Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.558244 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.572970 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.572950895 podStartE2EDuration="2.572950895s" podCreationTimestamp="2026-03-14 07:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:31.564795007 +0000 UTC m=+1490.826971799" watchObservedRunningTime="2026-03-14 07:23:31.572950895 +0000 UTC m=+1490.835127687" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.712205 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4ds"] Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.713428 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.719207 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.720482 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.725127 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4ds"] Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.769055 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.769122 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " 
pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.769272 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj7h4\" (UniqueName: \"kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.770044 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.816658 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.816652 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.871640 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " 
pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.871690 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj7h4\" (UniqueName: \"kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.871799 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.871846 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.875603 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.875635 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 
crc kubenswrapper[4893]: I0314 07:23:31.882003 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:31 crc kubenswrapper[4893]: I0314 07:23:31.895043 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj7h4\" (UniqueName: \"kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4\") pod \"nova-cell1-cell-mapping-sw4ds\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:32 crc kubenswrapper[4893]: I0314 07:23:32.029855 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:32 crc kubenswrapper[4893]: I0314 07:23:32.571238 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4ds"] Mar 14 07:23:32 crc kubenswrapper[4893]: W0314 07:23:32.583667 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dd0d41_68ca_458d_9011_a2c167fda868.slice/crio-abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950 WatchSource:0}: Error finding container abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950: Status 404 returned error can't find the container with id abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950 Mar 14 07:23:32 crc kubenswrapper[4893]: I0314 07:23:32.978794 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.047342 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 
07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.047581 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="dnsmasq-dns" containerID="cri-o://5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e" gracePeriod=10 Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.511129 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.570373 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4ds" event={"ID":"24dd0d41-68ca-458d-9011-a2c167fda868","Type":"ContainerStarted","Data":"90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c"} Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.570419 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4ds" event={"ID":"24dd0d41-68ca-458d-9011-a2c167fda868","Type":"ContainerStarted","Data":"abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950"} Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.583347 4893 generic.go:334] "Generic (PLEG): container finished" podID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerID="5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e" exitCode=0 Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.583413 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" event={"ID":"f1343ad6-1ec8-4249-826c-df0efc18fcb8","Type":"ContainerDied","Data":"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e"} Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.583438 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" 
event={"ID":"f1343ad6-1ec8-4249-826c-df0efc18fcb8","Type":"ContainerDied","Data":"4b7bd6321dac21e164c9aa65c2c4f4148bf0e4e783d9ec2ceed62a2b8ce636c3"} Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.583687 4893 scope.go:117] "RemoveContainer" containerID="5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.583814 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-p584x" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.589453 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sw4ds" podStartSLOduration=2.5894433279999998 podStartE2EDuration="2.589443328s" podCreationTimestamp="2026-03-14 07:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:33.587551721 +0000 UTC m=+1492.849728523" watchObservedRunningTime="2026-03-14 07:23:33.589443328 +0000 UTC m=+1492.851620120" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607201 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607395 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607429 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607476 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607595 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.607637 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m75qh\" (UniqueName: \"kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh\") pod \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\" (UID: \"f1343ad6-1ec8-4249-826c-df0efc18fcb8\") " Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.608731 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerStarted","Data":"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"} Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.609685 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.612118 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh" (OuterVolumeSpecName: "kube-api-access-m75qh") pod 
"f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "kube-api-access-m75qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.617140 4893 scope.go:117] "RemoveContainer" containerID="bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.644849 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7959597870000001 podStartE2EDuration="5.644829807s" podCreationTimestamp="2026-03-14 07:23:28 +0000 UTC" firstStartedPulling="2026-03-14 07:23:29.406853544 +0000 UTC m=+1488.669030336" lastFinishedPulling="2026-03-14 07:23:33.255723564 +0000 UTC m=+1492.517900356" observedRunningTime="2026-03-14 07:23:33.637609262 +0000 UTC m=+1492.899786054" watchObservedRunningTime="2026-03-14 07:23:33.644829807 +0000 UTC m=+1492.907006599" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.662382 4893 scope.go:117] "RemoveContainer" containerID="5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e" Mar 14 07:23:33 crc kubenswrapper[4893]: E0314 07:23:33.663476 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e\": container with ID starting with 5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e not found: ID does not exist" containerID="5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.663508 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e"} err="failed to get container status \"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e\": rpc error: code = NotFound desc = 
could not find container \"5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e\": container with ID starting with 5610af0d28e422292b929087d858b3cea654e6cea1225c3623ef93e1dccb644e not found: ID does not exist" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.663539 4893 scope.go:117] "RemoveContainer" containerID="bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013" Mar 14 07:23:33 crc kubenswrapper[4893]: E0314 07:23:33.663891 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013\": container with ID starting with bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013 not found: ID does not exist" containerID="bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.663923 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013"} err="failed to get container status \"bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013\": rpc error: code = NotFound desc = could not find container \"bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013\": container with ID starting with bf35d0d0d553b90bea3ea8c8414f7f3701ce5091dc704314fb9c763a78963013 not found: ID does not exist" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.666424 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config" (OuterVolumeSpecName: "config") pod "f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.669028 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.674256 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.674383 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.676293 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1343ad6-1ec8-4249-826c-df0efc18fcb8" (UID: "f1343ad6-1ec8-4249-826c-df0efc18fcb8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710335 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710369 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710382 4893 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710394 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710406 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m75qh\" (UniqueName: \"kubernetes.io/projected/f1343ad6-1ec8-4249-826c-df0efc18fcb8-kube-api-access-m75qh\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.710418 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1343ad6-1ec8-4249-826c-df0efc18fcb8-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:33 crc kubenswrapper[4893]: I0314 07:23:33.993865 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 07:23:34 crc kubenswrapper[4893]: I0314 07:23:34.005511 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-p584x"] Mar 14 
07:23:35 crc kubenswrapper[4893]: I0314 07:23:35.395474 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" path="/var/lib/kubelet/pods/f1343ad6-1ec8-4249-826c-df0efc18fcb8/volumes" Mar 14 07:23:37 crc kubenswrapper[4893]: E0314 07:23:37.367306 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dd0d41_68ca_458d_9011_a2c167fda868.slice/crio-conmon-90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dd0d41_68ca_458d_9011_a2c167fda868.slice/crio-90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:23:37 crc kubenswrapper[4893]: I0314 07:23:37.716433 4893 generic.go:334] "Generic (PLEG): container finished" podID="24dd0d41-68ca-458d-9011-a2c167fda868" containerID="90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c" exitCode=0 Mar 14 07:23:37 crc kubenswrapper[4893]: I0314 07:23:37.716488 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4ds" event={"ID":"24dd0d41-68ca-458d-9011-a2c167fda868","Type":"ContainerDied","Data":"90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c"} Mar 14 07:23:38 crc kubenswrapper[4893]: I0314 07:23:38.803739 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:38 crc kubenswrapper[4893]: I0314 07:23:38.804050 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.126610 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.244720 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj7h4\" (UniqueName: \"kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4\") pod \"24dd0d41-68ca-458d-9011-a2c167fda868\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.244912 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle\") pod \"24dd0d41-68ca-458d-9011-a2c167fda868\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.244993 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts\") pod \"24dd0d41-68ca-458d-9011-a2c167fda868\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.245136 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data\") pod \"24dd0d41-68ca-458d-9011-a2c167fda868\" (UID: \"24dd0d41-68ca-458d-9011-a2c167fda868\") " Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.249718 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4" (OuterVolumeSpecName: "kube-api-access-xj7h4") pod "24dd0d41-68ca-458d-9011-a2c167fda868" (UID: "24dd0d41-68ca-458d-9011-a2c167fda868"). InnerVolumeSpecName "kube-api-access-xj7h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.260987 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts" (OuterVolumeSpecName: "scripts") pod "24dd0d41-68ca-458d-9011-a2c167fda868" (UID: "24dd0d41-68ca-458d-9011-a2c167fda868"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.290034 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24dd0d41-68ca-458d-9011-a2c167fda868" (UID: "24dd0d41-68ca-458d-9011-a2c167fda868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.292603 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data" (OuterVolumeSpecName: "config-data") pod "24dd0d41-68ca-458d-9011-a2c167fda868" (UID: "24dd0d41-68ca-458d-9011-a2c167fda868"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.348709 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.348778 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.348806 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj7h4\" (UniqueName: \"kubernetes.io/projected/24dd0d41-68ca-458d-9011-a2c167fda868-kube-api-access-xj7h4\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.348835 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24dd0d41-68ca-458d-9011-a2c167fda868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.743944 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4ds" event={"ID":"24dd0d41-68ca-458d-9011-a2c167fda868","Type":"ContainerDied","Data":"abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950"} Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.743998 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abdd0774b03685891e1edb6ac45f1012a5b9fa1fd7e8e91af13488b7bcd10950" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.744072 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4ds" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.889885 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.889966 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.941036 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.954099 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:39 crc kubenswrapper[4893]: I0314 07:23:39.954343 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerName="nova-scheduler-scheduler" containerID="cri-o://b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" gracePeriod=30 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.008361 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.008757 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-log" containerID="cri-o://45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e" gracePeriod=30 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.008878 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-metadata" containerID="cri-o://a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a" gracePeriod=30 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.757018 4893 
generic.go:334] "Generic (PLEG): container finished" podID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerID="45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e" exitCode=143 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.757321 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerDied","Data":"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e"} Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.757714 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-log" containerID="cri-o://cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5" gracePeriod=30 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.757758 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-api" containerID="cri-o://82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228" gracePeriod=30 Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.775424 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": EOF" Mar 14 07:23:40 crc kubenswrapper[4893]: I0314 07:23:40.775467 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": EOF" Mar 14 07:23:41 crc kubenswrapper[4893]: E0314 07:23:41.528651 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: 
, exit code -1" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:41 crc kubenswrapper[4893]: E0314 07:23:41.530681 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:41 crc kubenswrapper[4893]: E0314 07:23:41.532388 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:41 crc kubenswrapper[4893]: E0314 07:23:41.532469 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerName="nova-scheduler-scheduler" Mar 14 07:23:41 crc kubenswrapper[4893]: I0314 07:23:41.775236 4893 generic.go:334] "Generic (PLEG): container finished" podID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerID="cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5" exitCode=143 Mar 14 07:23:41 crc kubenswrapper[4893]: I0314 07:23:41.775348 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerDied","Data":"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5"} Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.629403 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.732959 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle\") pod \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.733026 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs\") pod \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.733285 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jm94\" (UniqueName: \"kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94\") pod \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.733318 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs\") pod \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.733335 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data\") pod \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\" (UID: \"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a\") " Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.734170 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs" (OuterVolumeSpecName: "logs") pod "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" (UID: "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.740444 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94" (OuterVolumeSpecName: "kube-api-access-2jm94") pod "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" (UID: "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a"). InnerVolumeSpecName "kube-api-access-2jm94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.759949 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" (UID: "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.760877 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data" (OuterVolumeSpecName: "config-data") pod "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" (UID: "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.789813 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" (UID: "37ed3c89-6d6e-481f-a8cb-c96b04c5c13a"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.802514 4893 generic.go:334] "Generic (PLEG): container finished" podID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerID="a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a" exitCode=0 Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.802622 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerDied","Data":"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a"} Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.802669 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ed3c89-6d6e-481f-a8cb-c96b04c5c13a","Type":"ContainerDied","Data":"be20cb920ae2eb8c6359ac73e2e243eb4089bce3f398bc0ee44624b444c23810"} Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.802706 4893 scope.go:117] "RemoveContainer" containerID="a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.802910 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.836050 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jm94\" (UniqueName: \"kubernetes.io/projected/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-kube-api-access-2jm94\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.836100 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.836119 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.836139 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.836156 4893 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.860324 4893 scope.go:117] "RemoveContainer" containerID="45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.862917 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.878626 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890180 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.890624 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="dnsmasq-dns" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890636 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="dnsmasq-dns" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.890652 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-log" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890658 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-log" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.890675 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="init" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890681 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="init" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.890694 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-metadata" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890700 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-metadata" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.890711 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dd0d41-68ca-458d-9011-a2c167fda868" containerName="nova-manage" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890717 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dd0d41-68ca-458d-9011-a2c167fda868" containerName="nova-manage" 
Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890925 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-metadata" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890936 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1343ad6-1ec8-4249-826c-df0efc18fcb8" containerName="dnsmasq-dns" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890948 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" containerName="nova-metadata-log" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.890958 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="24dd0d41-68ca-458d-9011-a2c167fda868" containerName="nova-manage" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.893763 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.894653 4893 scope.go:117] "RemoveContainer" containerID="a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.895190 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a\": container with ID starting with a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a not found: ID does not exist" containerID="a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.895225 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a"} err="failed to get container status \"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a\": rpc error: code = NotFound desc = 
could not find container \"a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a\": container with ID starting with a2d2e00f0379f1d18e6c37655d633b0f21a0f8144a77f9ea02d818de5a06ae1a not found: ID does not exist" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.895300 4893 scope.go:117] "RemoveContainer" containerID="45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e" Mar 14 07:23:43 crc kubenswrapper[4893]: E0314 07:23:43.897192 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e\": container with ID starting with 45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e not found: ID does not exist" containerID="45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.897253 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e"} err="failed to get container status \"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e\": rpc error: code = NotFound desc = could not find container \"45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e\": container with ID starting with 45d66f4f6aaf1243be6d7f8337f16f24a2c1f42d48bef2ec7ad7845befd6e10e not found: ID does not exist" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.897336 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.897381 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 07:23:43 crc kubenswrapper[4893]: I0314 07:23:43.908760 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:44 crc kubenswrapper[4893]: 
I0314 07:23:44.041159 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.041223 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.041449 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.041491 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4lnp\" (UniqueName: \"kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.041618 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.143634 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.143727 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.143857 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.143897 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4lnp\" (UniqueName: \"kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.143980 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.144991 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " 
pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.148412 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.148614 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.151432 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.160509 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4lnp\" (UniqueName: \"kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp\") pod \"nova-metadata-0\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.230798 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.714141 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:44 crc kubenswrapper[4893]: I0314 07:23:44.818888 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerStarted","Data":"c757d516e66354fa6a18b9042b8d29238cef02b11c13610c41f431086a774b2e"} Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.386957 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ed3c89-6d6e-481f-a8cb-c96b04c5c13a" path="/var/lib/kubelet/pods/37ed3c89-6d6e-481f-a8cb-c96b04c5c13a/volumes" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.635487 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.674990 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9rgt\" (UniqueName: \"kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt\") pod \"4bbd017e-988b-409e-861c-098bb3ab86ca\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.675315 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle\") pod \"4bbd017e-988b-409e-861c-098bb3ab86ca\" (UID: \"4bbd017e-988b-409e-861c-098bb3ab86ca\") " Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.675400 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data\") pod \"4bbd017e-988b-409e-861c-098bb3ab86ca\" (UID: 
\"4bbd017e-988b-409e-861c-098bb3ab86ca\") " Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.683693 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt" (OuterVolumeSpecName: "kube-api-access-l9rgt") pod "4bbd017e-988b-409e-861c-098bb3ab86ca" (UID: "4bbd017e-988b-409e-861c-098bb3ab86ca"). InnerVolumeSpecName "kube-api-access-l9rgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.710073 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data" (OuterVolumeSpecName: "config-data") pod "4bbd017e-988b-409e-861c-098bb3ab86ca" (UID: "4bbd017e-988b-409e-861c-098bb3ab86ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.717410 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bbd017e-988b-409e-861c-098bb3ab86ca" (UID: "4bbd017e-988b-409e-861c-098bb3ab86ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.777488 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9rgt\" (UniqueName: \"kubernetes.io/projected/4bbd017e-988b-409e-861c-098bb3ab86ca-kube-api-access-l9rgt\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.777548 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.777560 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbd017e-988b-409e-861c-098bb3ab86ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.829943 4893 generic.go:334] "Generic (PLEG): container finished" podID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" exitCode=0 Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.829994 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bbd017e-988b-409e-861c-098bb3ab86ca","Type":"ContainerDied","Data":"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811"} Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.830330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bbd017e-988b-409e-861c-098bb3ab86ca","Type":"ContainerDied","Data":"07ac464ecbf483021ef0be8dc7349796787f2d387226a01986484c3f32863e30"} Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.830067 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.830357 4893 scope.go:117] "RemoveContainer" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.832779 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerStarted","Data":"3ed3ad5f48a78c2ee75e5dd1a6438ae0b7531b0fefc6d91c0f46aad1b39d3ee9"} Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.832811 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerStarted","Data":"928a88e96ca63f18e3047a55a85109e7284281e3348f6b7748c2c86111c4f15e"} Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.851325 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.851311559 podStartE2EDuration="2.851311559s" podCreationTimestamp="2026-03-14 07:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:45.849410204 +0000 UTC m=+1505.111586996" watchObservedRunningTime="2026-03-14 07:23:45.851311559 +0000 UTC m=+1505.113488351" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.869244 4893 scope.go:117] "RemoveContainer" containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" Mar 14 07:23:45 crc kubenswrapper[4893]: E0314 07:23:45.874760 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811\": container with ID starting with b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811 not found: ID does not exist" 
containerID="b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.874805 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811"} err="failed to get container status \"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811\": rpc error: code = NotFound desc = could not find container \"b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811\": container with ID starting with b2311004221e23c0af13a452a232d08910d8dc3bb94ea73ec138b9f7e2f2e811 not found: ID does not exist" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.887179 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.899928 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.910796 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:45 crc kubenswrapper[4893]: E0314 07:23:45.911209 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerName="nova-scheduler-scheduler" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.911227 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerName="nova-scheduler-scheduler" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.911384 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" containerName="nova-scheduler-scheduler" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.911982 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.914295 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.920354 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.980836 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.980923 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:45 crc kubenswrapper[4893]: I0314 07:23:45.981003 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng759\" (UniqueName: \"kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.082953 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.083072 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.083160 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng759\" (UniqueName: \"kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.088703 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.089135 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.110468 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng759\" (UniqueName: \"kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759\") pod \"nova-scheduler-0\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.228771 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.676462 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:46 crc kubenswrapper[4893]: W0314 07:23:46.781671 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd135e7_b208_4d7f_85f5_05baa2819788.slice/crio-a73bc026e01dea0510554fd14e663dd76f64073e52963f45bd43cdbbc28f22ce WatchSource:0}: Error finding container a73bc026e01dea0510554fd14e663dd76f64073e52963f45bd43cdbbc28f22ce: Status 404 returned error can't find the container with id a73bc026e01dea0510554fd14e663dd76f64073e52963f45bd43cdbbc28f22ce Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.782189 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.800529 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.800801 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.800857 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc 
kubenswrapper[4893]: I0314 07:23:46.800906 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.800945 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv9rn\" (UniqueName: \"kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.801032 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle\") pod \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\" (UID: \"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b\") " Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.801220 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs" (OuterVolumeSpecName: "logs") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.801678 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.804291 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn" (OuterVolumeSpecName: "kube-api-access-bv9rn") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "kube-api-access-bv9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.832197 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data" (OuterVolumeSpecName: "config-data") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.834796 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.842978 4893 generic.go:334] "Generic (PLEG): container finished" podID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerID="82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228" exitCode=0 Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.843156 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerDied","Data":"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228"} Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.843234 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b91c66f1-34a5-40f1-a3b7-c303e2c0a58b","Type":"ContainerDied","Data":"0610b496775e2065245b50b0c39415e9c04688c2ad24222d1b83f7a20eac4287"} Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.843296 4893 scope.go:117] "RemoveContainer" containerID="82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.843436 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.848518 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dd135e7-b208-4d7f-85f5-05baa2819788","Type":"ContainerStarted","Data":"a73bc026e01dea0510554fd14e663dd76f64073e52963f45bd43cdbbc28f22ce"} Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.856597 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.865050 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" (UID: "b91c66f1-34a5-40f1-a3b7-c303e2c0a58b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.898520 4893 scope.go:117] "RemoveContainer" containerID="cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.903195 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.903233 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv9rn\" (UniqueName: \"kubernetes.io/projected/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-kube-api-access-bv9rn\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.903246 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.903254 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.903263 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b-config-data\") on 
node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.928071 4893 scope.go:117] "RemoveContainer" containerID="82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228" Mar 14 07:23:46 crc kubenswrapper[4893]: E0314 07:23:46.928562 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228\": container with ID starting with 82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228 not found: ID does not exist" containerID="82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.928599 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228"} err="failed to get container status \"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228\": rpc error: code = NotFound desc = could not find container \"82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228\": container with ID starting with 82f19db5e133bed623ad9bee110b2012ffd8963ece27ea07c5d6440a3c2cc228 not found: ID does not exist" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.928624 4893 scope.go:117] "RemoveContainer" containerID="cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5" Mar 14 07:23:46 crc kubenswrapper[4893]: E0314 07:23:46.928982 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5\": container with ID starting with cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5 not found: ID does not exist" containerID="cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5" Mar 14 07:23:46 crc kubenswrapper[4893]: I0314 07:23:46.929010 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5"} err="failed to get container status \"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5\": rpc error: code = NotFound desc = could not find container \"cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5\": container with ID starting with cae30b6316de56e4496fa20c46e145f4daf192797854b1e46da7af248c75d2d5 not found: ID does not exist" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.182256 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.216725 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.241071 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:47 crc kubenswrapper[4893]: E0314 07:23:47.245798 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-api" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.245832 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-api" Mar 14 07:23:47 crc kubenswrapper[4893]: E0314 07:23:47.245890 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-log" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.245899 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-log" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.246386 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-log" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 
07:23:47.246422 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" containerName="nova-api-api" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.248750 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.254243 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.254315 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.254333 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.257855 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.311713 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.311923 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.312049 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data\") pod 
\"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.312158 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.312491 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prf8\" (UniqueName: \"kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.312681 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.392704 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bbd017e-988b-409e-861c-098bb3ab86ca" path="/var/lib/kubelet/pods/4bbd017e-988b-409e-861c-098bb3ab86ca/volumes" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.393862 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91c66f1-34a5-40f1-a3b7-c303e2c0a58b" path="/var/lib/kubelet/pods/b91c66f1-34a5-40f1-a3b7-c303e2c0a58b/volumes" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.413898 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4prf8\" (UniqueName: \"kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8\") pod \"nova-api-0\" (UID: 
\"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.413966 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.414002 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.414047 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.414081 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.414103 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.419891 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.420973 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.424921 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.427181 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.428746 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.437500 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prf8\" (UniqueName: \"kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8\") pod \"nova-api-0\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.577264 4893 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.860985 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dd135e7-b208-4d7f-85f5-05baa2819788","Type":"ContainerStarted","Data":"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c"} Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.887879 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.887855987 podStartE2EDuration="2.887855987s" podCreationTimestamp="2026-03-14 07:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:47.875791554 +0000 UTC m=+1507.137968366" watchObservedRunningTime="2026-03-14 07:23:47.887855987 +0000 UTC m=+1507.150032799" Mar 14 07:23:47 crc kubenswrapper[4893]: W0314 07:23:47.894790 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d5e368_4fac_48b9_a64c_717f3acf9388.slice/crio-0e443246654e1bf453e67cfc9d4f720cd7e137ce2408b6c5f4c882b5ee0c4b30 WatchSource:0}: Error finding container 0e443246654e1bf453e67cfc9d4f720cd7e137ce2408b6c5f4c882b5ee0c4b30: Status 404 returned error can't find the container with id 0e443246654e1bf453e67cfc9d4f720cd7e137ce2408b6c5f4c882b5ee0c4b30 Mar 14 07:23:47 crc kubenswrapper[4893]: I0314 07:23:47.908102 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:48 crc kubenswrapper[4893]: I0314 07:23:48.876251 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerStarted","Data":"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb"} Mar 14 07:23:48 crc kubenswrapper[4893]: I0314 07:23:48.876627 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerStarted","Data":"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80"} Mar 14 07:23:48 crc kubenswrapper[4893]: I0314 07:23:48.876647 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerStarted","Data":"0e443246654e1bf453e67cfc9d4f720cd7e137ce2408b6c5f4c882b5ee0c4b30"} Mar 14 07:23:48 crc kubenswrapper[4893]: I0314 07:23:48.900299 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.900280709 podStartE2EDuration="1.900280709s" podCreationTimestamp="2026-03-14 07:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:48.892431458 +0000 UTC m=+1508.154608270" watchObservedRunningTime="2026-03-14 07:23:48.900280709 +0000 UTC m=+1508.162457501" Mar 14 07:23:51 crc kubenswrapper[4893]: I0314 07:23:51.229812 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 07:23:54 crc kubenswrapper[4893]: I0314 07:23:54.231100 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:23:54 crc kubenswrapper[4893]: I0314 07:23:54.231735 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:23:55 crc kubenswrapper[4893]: I0314 07:23:55.237838 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:55 crc kubenswrapper[4893]: I0314 
07:23:55.245842 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:56 crc kubenswrapper[4893]: I0314 07:23:56.229094 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:23:56 crc kubenswrapper[4893]: I0314 07:23:56.281223 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:23:57 crc kubenswrapper[4893]: I0314 07:23:57.077143 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 07:23:57 crc kubenswrapper[4893]: I0314 07:23:57.578055 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:57 crc kubenswrapper[4893]: I0314 07:23:57.578201 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:58 crc kubenswrapper[4893]: I0314 07:23:58.592798 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:58 crc kubenswrapper[4893]: I0314 07:23:58.592798 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:58 crc kubenswrapper[4893]: I0314 07:23:58.950208 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.147154 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557884-7b2sz"] Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.149506 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.152610 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.153069 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.153310 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.158259 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-7b2sz"] Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.186166 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsgx\" (UniqueName: \"kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx\") pod \"auto-csr-approver-29557884-7b2sz\" (UID: \"4f56851e-f6ad-46d1-9b96-8c6a9d80a227\") " pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.287485 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llsgx\" (UniqueName: \"kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx\") pod \"auto-csr-approver-29557884-7b2sz\" (UID: \"4f56851e-f6ad-46d1-9b96-8c6a9d80a227\") " pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:00 
crc kubenswrapper[4893]: I0314 07:24:00.315406 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsgx\" (UniqueName: \"kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx\") pod \"auto-csr-approver-29557884-7b2sz\" (UID: \"4f56851e-f6ad-46d1-9b96-8c6a9d80a227\") " pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.488977 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:00 crc kubenswrapper[4893]: I0314 07:24:00.991472 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-7b2sz"] Mar 14 07:24:01 crc kubenswrapper[4893]: I0314 07:24:01.071746 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" event={"ID":"4f56851e-f6ad-46d1-9b96-8c6a9d80a227","Type":"ContainerStarted","Data":"518889f568df3a05b0d8812a72bf62297c9c289a3e4170a753ffcc7b5a802674"} Mar 14 07:24:02 crc kubenswrapper[4893]: I0314 07:24:02.230927 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:24:02 crc kubenswrapper[4893]: I0314 07:24:02.231313 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:24:03 crc kubenswrapper[4893]: I0314 07:24:03.099214 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f56851e-f6ad-46d1-9b96-8c6a9d80a227" containerID="0f1669c9351590b8d4ea31c6f74fb1dfaf88c99551b1ad6898325ff1daada067" exitCode=0 Mar 14 07:24:03 crc kubenswrapper[4893]: I0314 07:24:03.099333 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" event={"ID":"4f56851e-f6ad-46d1-9b96-8c6a9d80a227","Type":"ContainerDied","Data":"0f1669c9351590b8d4ea31c6f74fb1dfaf88c99551b1ad6898325ff1daada067"} Mar 14 
07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.241223 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.247515 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.252316 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.546179 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.586924 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llsgx\" (UniqueName: \"kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx\") pod \"4f56851e-f6ad-46d1-9b96-8c6a9d80a227\" (UID: \"4f56851e-f6ad-46d1-9b96-8c6a9d80a227\") " Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.597684 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx" (OuterVolumeSpecName: "kube-api-access-llsgx") pod "4f56851e-f6ad-46d1-9b96-8c6a9d80a227" (UID: "4f56851e-f6ad-46d1-9b96-8c6a9d80a227"). InnerVolumeSpecName "kube-api-access-llsgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:04 crc kubenswrapper[4893]: I0314 07:24:04.689651 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llsgx\" (UniqueName: \"kubernetes.io/projected/4f56851e-f6ad-46d1-9b96-8c6a9d80a227-kube-api-access-llsgx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.130828 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.130859 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-7b2sz" event={"ID":"4f56851e-f6ad-46d1-9b96-8c6a9d80a227","Type":"ContainerDied","Data":"518889f568df3a05b0d8812a72bf62297c9c289a3e4170a753ffcc7b5a802674"} Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.130920 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="518889f568df3a05b0d8812a72bf62297c9c289a3e4170a753ffcc7b5a802674" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.140748 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.577756 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.577877 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.648199 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-cvzrx"] Mar 14 07:24:05 crc kubenswrapper[4893]: I0314 07:24:05.656546 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-cvzrx"] Mar 14 07:24:07 crc kubenswrapper[4893]: I0314 07:24:07.388772 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85464791-1f62-4d33-bd20-813896eef4b8" path="/var/lib/kubelet/pods/85464791-1f62-4d33-bd20-813896eef4b8/volumes" Mar 14 07:24:07 crc kubenswrapper[4893]: I0314 07:24:07.586327 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 07:24:07 crc kubenswrapper[4893]: I0314 07:24:07.590104 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-api-0" Mar 14 07:24:07 crc kubenswrapper[4893]: I0314 07:24:07.591261 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:24:08 crc kubenswrapper[4893]: I0314 07:24:08.167358 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.748906 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"] Mar 14 07:24:22 crc kubenswrapper[4893]: E0314 07:24:22.750118 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f56851e-f6ad-46d1-9b96-8c6a9d80a227" containerName="oc" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.750140 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f56851e-f6ad-46d1-9b96-8c6a9d80a227" containerName="oc" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.750515 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f56851e-f6ad-46d1-9b96-8c6a9d80a227" containerName="oc" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.760731 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.791678 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"] Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.885265 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.885322 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.885364 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxzlx\" (UniqueName: \"kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.987744 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.987812 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.987863 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxzlx\" (UniqueName: \"kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.988347 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:22 crc kubenswrapper[4893]: I0314 07:24:22.988683 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:23 crc kubenswrapper[4893]: I0314 07:24:23.012866 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxzlx\" (UniqueName: \"kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx\") pod \"redhat-operators-qdfzm\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:23 crc kubenswrapper[4893]: I0314 07:24:23.109449 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:23 crc kubenswrapper[4893]: I0314 07:24:23.637785 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"] Mar 14 07:24:24 crc kubenswrapper[4893]: I0314 07:24:24.362022 4893 generic.go:334] "Generic (PLEG): container finished" podID="6354f443-b3e8-4932-a319-315187cebac7" containerID="ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4" exitCode=0 Mar 14 07:24:24 crc kubenswrapper[4893]: I0314 07:24:24.362119 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerDied","Data":"ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4"} Mar 14 07:24:24 crc kubenswrapper[4893]: I0314 07:24:24.362269 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerStarted","Data":"7dd45de766400d5e4e9294f0ecec545522417997e7d7f797447d1064f473c43e"} Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.408949 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerStarted","Data":"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76"} Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.443012 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.444265 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.448284 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.458959 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.479739 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.480842 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.485949 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.532165 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqb5\" (UniqueName: \"kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5\") pod \"root-account-create-update-zwt4c\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.532365 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts\") pod \"root-account-create-update-zwt4c\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.549068 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:25 crc 
kubenswrapper[4893]: I0314 07:24:25.636326 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts\") pod \"root-account-create-update-zwt4c\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.636632 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgmd\" (UniqueName: \"kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.636743 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqb5\" (UniqueName: \"kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5\") pod \"root-account-create-update-zwt4c\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.636956 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.637182 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts\") pod \"root-account-create-update-zwt4c\" (UID: 
\"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.638150 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.638396 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" containerName="openstackclient" containerID="cri-o://743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090" gracePeriod=2 Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.677747 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.679048 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.683200 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.698247 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqb5\" (UniqueName: \"kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5\") pod \"root-account-create-update-zwt4c\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.698311 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.734753 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.740808 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.740878 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgmd\" (UniqueName: \"kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.742101 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.781096 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.789071 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgmd\" (UniqueName: \"kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd\") pod \"neutron-78ff-account-create-update-rbfhn\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.809875 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.898215 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92hw\" (UniqueName: \"kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.898429 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:25 crc kubenswrapper[4893]: I0314 07:24:25.920701 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.004579 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92hw\" (UniqueName: \"kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.004666 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:26 
crc kubenswrapper[4893]: I0314 07:24:26.005380 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.051153 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78ff-account-create-update-8m2n9"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.054308 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92hw\" (UniqueName: \"kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw\") pod \"placement-3b5a-account-create-update-hkpb6\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.066333 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.069117 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78ff-account-create-update-8m2n9"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.091787 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-pmw52"] Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.108600 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.108654 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data podName:7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e nodeName:}" failed. 
No retries permitted until 2026-03-14 07:24:26.608639531 +0000 UTC m=+1545.870816323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data") pod "rabbitmq-cell1-server-0" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.119096 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-pmw52"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.158805 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-98sg2"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.171469 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-98sg2"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.183588 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7crn6"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.208601 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7crn6"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.225614 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3b5a-account-create-update-8nwqm"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.258368 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3b5a-account-create-update-8nwqm"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.280289 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8vwkl"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.320597 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8vwkl"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.348215 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-mdkpp"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.365490 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mdkpp"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.383567 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xwpzh"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.404752 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xwpzh"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.426584 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.427021 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" containerName="openstackclient" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.427035 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" containerName="openstackclient" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.427193 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" containerName="openstackclient" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.427863 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.430188 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.446231 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.448579 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.451673 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.496111 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.497227 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.499144 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.518316 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.518392 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjfl\" (UniqueName: \"kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.527914 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.547118 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.570583 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.579776 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.614609 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.615204 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="openstack-network-exporter" containerID="cri-o://eace24dbb800288f6b1dfc821a8adc0ebb57521e5ce25c1b28ee9518e85fe50f" gracePeriod=300 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627070 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzl2v\" (UniqueName: \"kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627135 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627246 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627373 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjfl\" (UniqueName: \"kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627557 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzmx\" (UniqueName: \"kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.627596 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.628628 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.628676 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data 
podName:7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e nodeName:}" failed. No retries permitted until 2026-03-14 07:24:27.628660722 +0000 UTC m=+1546.890837514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data") pod "rabbitmq-cell1-server-0" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.629101 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.635558 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.635776 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="dnsmasq-dns" containerID="cri-o://acd718ecc01f85e0d44e4851299d2daef071eb198233eb55a80f4008d7f7cece" gracePeriod=10 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.653503 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjfl\" (UniqueName: \"kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl\") pod \"nova-api-f82a-account-create-update-cwrq9\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.656878 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f82a-account-create-update-99zsr"] Mar 14 07:24:26 crc 
kubenswrapper[4893]: I0314 07:24:26.667947 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f82a-account-create-update-99zsr"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.676299 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.677350 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="openstack-network-exporter" containerID="cri-o://8649f7f4f4c1d54eaf72dbf5276bdbd35d2015a4eb0c3467c699f17e2aee8e05" gracePeriod=300 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.711268 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.711587 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="ovn-northd" containerID="cri-o://7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.711998 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="openstack-network-exporter" containerID="cri-o://a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.730285 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzmx\" (UniqueName: \"kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 
07:24:26.730343 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.730410 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzl2v\" (UniqueName: \"kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.730436 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.732366 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: E0314 07:24:26.732456 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data podName:a752b3c8-284e-490f-be39-506e7a075c6f nodeName:}" failed. No retries permitted until 2026-03-14 07:24:27.232438422 +0000 UTC m=+1546.494615214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data") pod "rabbitmq-server-0" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f") : configmap "rabbitmq-config-data" not found Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.733060 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.734042 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.752613 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="ovsdbserver-sb" containerID="cri-o://d32e5cf30b4cda1330ae637ea8cabb6506d4c4567367cf5b048fc3ef60031d41" gracePeriod=300 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.764067 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzl2v\" (UniqueName: \"kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v\") pod \"nova-cell1-33a2-account-create-update-fzsjz\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.764609 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7dzmx\" (UniqueName: \"kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx\") pod \"nova-cell0-259c-account-create-update-5jsnr\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.785564 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.785789 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cdc5f965f-t6wfv" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-api" containerID="cri-o://14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.787099 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cdc5f965f-t6wfv" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-httpd" containerID="cri-o://7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.808699 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.840690 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.847602 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.853294 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="ovsdbserver-nb" containerID="cri-o://89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" gracePeriod=300 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.908890 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.909269 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="cinder-scheduler" containerID="cri-o://200f0500dd95736bf725b120a1467c014a0ae3d6596c940427a6b0fa69a9ddd9" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.909337 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="probe" containerID="cri-o://6d4a0c5d5f1c982cd165c28bd58f8c768f1c7c5b718e400d69233659dc76ae51" gracePeriod=30 Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.991745 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:24:26 crc kubenswrapper[4893]: I0314 07:24:26.992103 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-7bbf4" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" containerName="openstack-network-exporter" containerID="cri-o://eda28ec59118715e8794d9877944b4ef0d5e23bf08d3c266e2f2246bfd2ec127" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.028240 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 
07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.042716 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.043016 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api-log" containerID="cri-o://492cd7441a800dc10a30bd9b6a28dd3d89e262a94b290571698a0218593d8d7f" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.043151 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api" containerID="cri-o://471e6c6aa09a8b1a609a1618d206f8f19bac722872a90829f538c1ebf60510ec" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.110964 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.111440 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-log" containerID="cri-o://43f69db823a3f9f8c85a6a93d52c4972fbfa042fe09bc248a37ffd104f7a1ea4" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.111803 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-httpd" containerID="cri-o://4d1c426d2b310ef6f81cd1f91ebfb6670cc4b8ebb961307ca3eb73f633e88e39" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.153061 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8rcbf"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.198283 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.200134 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-log" containerID="cri-o://f69d37d7aa3aaad764b626c6801d7a4b236d60fdfae8857b91bae1f1d591f9a8" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.201043 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-httpd" containerID="cri-o://2fc121bd16a2e48ba5ae0e8191ee49932905a50f6c55bad773e4ec5663d05d86" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.240594 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-g6bdg"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.259171 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-tbdb9"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.277736 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-tbdb9"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.285241 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-g6bdg"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.295196 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.295470 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d4589578b-zwqpr" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-log" containerID="cri-o://bea0181f97653e52ef6b00db91426f325b021b8db102f6f9ca17b54176110d69" 
gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.295861 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d4589578b-zwqpr" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-api" containerID="cri-o://071be3d7b6163f25cab591d62f20975ae6d40b5630f0ede440ea7ddafb12315f" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.297200 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.297236 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data podName:a752b3c8-284e-490f-be39-506e7a075c6f nodeName:}" failed. No retries permitted until 2026-03-14 07:24:28.297223295 +0000 UTC m=+1547.559400087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data") pod "rabbitmq-server-0" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f") : configmap "rabbitmq-config-data" not found Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.323847 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f553-account-create-update-d5kx6"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.339566 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f553-account-create-update-d5kx6"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.593382 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bc0af5-83ce-4dd0-b4b9-53e9307905ad" path="/var/lib/kubelet/pods/05bc0af5-83ce-4dd0-b4b9-53e9307905ad/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.601927 4893 generic.go:334] "Generic (PLEG): container finished" podID="5241cf43-f60b-4499-ae07-6b449f6ef57e" 
containerID="acd718ecc01f85e0d44e4851299d2daef071eb198233eb55a80f4008d7f7cece" exitCode=0 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.605263 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4b8032-3d08-4574-9987-fc159aa8c506" path="/var/lib/kubelet/pods/0b4b8032-3d08-4574-9987-fc159aa8c506/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.606177 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb86b36-ba27-4788-9a19-451d42d8a4e2" path="/var/lib/kubelet/pods/2bb86b36-ba27-4788-9a19-451d42d8a4e2/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.609744 4893 generic.go:334] "Generic (PLEG): container finished" podID="6354f443-b3e8-4932-a319-315187cebac7" containerID="d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76" exitCode=0 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.615096 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b3f7b9-3c33-4486-bba1-2f528c0eb212" path="/var/lib/kubelet/pods/40b3f7b9-3c33-4486-bba1-2f528c0eb212/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.618744 4893 generic.go:334] "Generic (PLEG): container finished" podID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerID="a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1" exitCode=2 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.620639 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435a4f9d-9f5b-4f9f-bc4a-b4651831bac1" path="/var/lib/kubelet/pods/435a4f9d-9f5b-4f9f-bc4a-b4651831bac1/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.621249 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44268a3d-27a9-41ee-a0d7-38ba3b152ce5" path="/var/lib/kubelet/pods/44268a3d-27a9-41ee-a0d7-38ba3b152ce5/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.625850 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerID="f69d37d7aa3aaad764b626c6801d7a4b236d60fdfae8857b91bae1f1d591f9a8" exitCode=143 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.626369 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f8afde-298d-42a9-a92e-dd3b4561b98c" path="/var/lib/kubelet/pods/47f8afde-298d-42a9-a92e-dd3b4561b98c/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.627505 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3cdf1f-7963-494d-86f8-699e7401fe91" path="/var/lib/kubelet/pods/7e3cdf1f-7963-494d-86f8-699e7401fe91/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.629894 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7bbf4_457f660f-9b87-4d37-a92e-0c30bb2a2fea/openstack-network-exporter/0.log" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.648314 4893 generic.go:334] "Generic (PLEG): container finished" podID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" containerID="eda28ec59118715e8794d9877944b4ef0d5e23bf08d3c266e2f2246bfd2ec127" exitCode=2 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.652534 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2" path="/var/lib/kubelet/pods/8a25b2a2-2d1a-4e0c-8114-c796ed4fe1e2/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.660297 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3d2794-7a14-49ad-9ca9-a73ec58d2981" path="/var/lib/kubelet/pods/8e3d2794-7a14-49ad-9ca9-a73ec58d2981/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.665091 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:27 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c 
#!/bin/bash Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: if [ -n "neutron" ]; then Mar 14 07:24:27 crc kubenswrapper[4893]: GRANT_DATABASE="neutron" Mar 14 07:24:27 crc kubenswrapper[4893]: else Mar 14 07:24:27 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:27 crc kubenswrapper[4893]: fi Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:27 crc kubenswrapper[4893]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:27 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:27 crc kubenswrapper[4893]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:27 crc kubenswrapper[4893]: # support updates Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.671077 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-78ff-account-create-update-rbfhn" podUID="22218c00-c79a-4d4a-a0e4-dc59f10ebaaf" Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.672083 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:27 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: if [ -n "placement" ]; then Mar 14 07:24:27 crc kubenswrapper[4893]: GRANT_DATABASE="placement" Mar 14 07:24:27 crc kubenswrapper[4893]: else Mar 14 07:24:27 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:27 crc kubenswrapper[4893]: fi Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:27 crc kubenswrapper[4893]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:27 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:27 crc kubenswrapper[4893]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:27 crc kubenswrapper[4893]: # support updates Mar 14 07:24:27 crc kubenswrapper[4893]: Mar 14 07:24:27 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.673982 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-3b5a-account-create-update-hkpb6" podUID="c9a21025-970d-4b50-8f80-b0926242b929" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.674939 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1f33b9a-21cf-4ca9-82a1-5fb7592bb749/ovsdbserver-sb/0.log" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.674976 4893 generic.go:334] "Generic (PLEG): container finished" podID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerID="eace24dbb800288f6b1dfc821a8adc0ebb57521e5ce25c1b28ee9518e85fe50f" exitCode=2 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.674989 4893 generic.go:334] "Generic (PLEG): container finished" podID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerID="d32e5cf30b4cda1330ae637ea8cabb6506d4c4567367cf5b048fc3ef60031d41" exitCode=143 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.677786 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b113c1f3-1fdd-4fd0-806e-42707f79ba1e" path="/var/lib/kubelet/pods/b113c1f3-1fdd-4fd0-806e-42707f79ba1e/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.678408 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" 
containerID="43f69db823a3f9f8c85a6a93d52c4972fbfa042fe09bc248a37ffd104f7a1ea4" exitCode=143 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.686199 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09ba521-7c5d-4b30-902d-d7634a77e369" path="/var/lib/kubelet/pods/f09ba521-7c5d-4b30-902d-d7634a77e369/volumes" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753174 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9600-account-create-update-dwgnl"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753207 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9600-account-create-update-dwgnl"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753225 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-k5lhh"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753236 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" event={"ID":"5241cf43-f60b-4499-ae07-6b449f6ef57e","Type":"ContainerDied","Data":"acd718ecc01f85e0d44e4851299d2daef071eb198233eb55a80f4008d7f7cece"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753301 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zwt4c" event={"ID":"e6be5c8e-c381-4e29-90e7-069d902c1805","Type":"ContainerStarted","Data":"dff232fbc3c9a3abb4d7e60a566fdb00bf629c3d040d8e58813243ceb43d28ae"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753316 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-k5lhh"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerDied","Data":"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76"} Mar 14 07:24:27 crc 
kubenswrapper[4893]: I0314 07:24:27.753342 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerDied","Data":"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753353 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerDied","Data":"f69d37d7aa3aaad764b626c6801d7a4b236d60fdfae8857b91bae1f1d591f9a8"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753390 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78ff-account-create-update-rbfhn" event={"ID":"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf","Type":"ContainerStarted","Data":"ad03753bc9e5096402c46b4a68aac223028a3c27d17a6333c612615b285cf61f"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753399 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4ds"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753410 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4ds"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753421 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8knjw"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753432 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8knjw"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753472 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7bbf4" event={"ID":"457f660f-9b87-4d37-a92e-0c30bb2a2fea","Type":"ContainerDied","Data":"eda28ec59118715e8794d9877944b4ef0d5e23bf08d3c266e2f2246bfd2ec127"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753484 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerDied","Data":"eace24dbb800288f6b1dfc821a8adc0ebb57521e5ce25c1b28ee9518e85fe50f"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753496 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerDied","Data":"d32e5cf30b4cda1330ae637ea8cabb6506d4c4567367cf5b048fc3ef60031d41"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753504 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753549 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wr6c4"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753561 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wr6c4"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753571 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerDied","Data":"43f69db823a3f9f8c85a6a93d52c4972fbfa042fe09bc248a37ffd104f7a1ea4"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753583 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753624 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6g9t6"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753636 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6g9t6"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753646 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w6klp"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753657 4893 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w6klp"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.753686 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.755736 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-server" containerID="cri-o://e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.687692 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e12ac0dd-46ad-4d51-9ebc-acdd264649b2/ovsdbserver-nb/0.log" Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.756003 4893 generic.go:334] "Generic (PLEG): container finished" podID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerID="8649f7f4f4c1d54eaf72dbf5276bdbd35d2015a4eb0c3467c699f17e2aee8e05" exitCode=2 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.756077 4893 generic.go:334] "Generic (PLEG): container finished" podID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerID="89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" exitCode=143 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.756143 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerDied","Data":"8649f7f4f4c1d54eaf72dbf5276bdbd35d2015a4eb0c3467c699f17e2aee8e05"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.756223 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerDied","Data":"89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396"} Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 
07:24:27.764591 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="swift-recon-cron" containerID="cri-o://50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.764949 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="rsync" containerID="cri-o://61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.765076 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-expirer" containerID="cri-o://42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.765228 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-updater" containerID="cri-o://8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.765356 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-auditor" containerID="cri-o://b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.765485 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-replicator" 
containerID="cri-o://c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769108 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-server" containerID="cri-o://5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769247 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-updater" containerID="cri-o://273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769374 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-auditor" containerID="cri-o://c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769486 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-replicator" containerID="cri-o://68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769751 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-server" containerID="cri-o://9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.769950 4893 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-reaper" containerID="cri-o://265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.770152 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-auditor" containerID="cri-o://10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.770352 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-replicator" containerID="cri-o://8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3" gracePeriod=30 Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.778388 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-k8kjv"] Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.807321 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:27 crc kubenswrapper[4893]: E0314 07:24:27.807390 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data podName:7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e nodeName:}" failed. No retries permitted until 2026-03-14 07:24:29.807367977 +0000 UTC m=+1549.069544769 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data") pod "rabbitmq-cell1-server-0" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.928018 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-k8kjv"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.959298 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8d2e-account-create-update-db4t8"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.974435 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8d2e-account-create-update-db4t8"] Mar 14 07:24:27 crc kubenswrapper[4893]: I0314 07:24:27.987681 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.208:5353: connect: connection refused" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.005472 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ftbdk"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.015752 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ftbdk"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.025369 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.025800 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68559f9fc9-2zprf" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker-log" containerID="cri-o://a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2" gracePeriod=30 Mar 14 07:24:28 crc 
kubenswrapper[4893]: I0314 07:24:28.026557 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68559f9fc9-2zprf" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker" containerID="cri-o://41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.040235 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.040501 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener-log" containerID="cri-o://dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.040937 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener" containerID="cri-o://76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.054602 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.062184 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.069676 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.077649 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: 
I0314 07:24:28.077866 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-log" containerID="cri-o://d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.078460 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-api" containerID="cri-o://8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.094628 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.099415 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p7mnh"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.106788 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p7mnh"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.113054 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.113431 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-log" containerID="cri-o://928a88e96ca63f18e3047a55a85109e7284281e3348f6b7748c2c86111c4f15e" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.113785 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-metadata" containerID="cri-o://3ed3ad5f48a78c2ee75e5dd1a6438ae0b7531b0fefc6d91c0f46aad1b39d3ee9" 
gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.135261 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.135461 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-684d6b469b-c8l2b" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api-log" containerID="cri-o://142a5e92673e832e9dc3ab1aaae6b53df019680363c32f4373ec08b60c93fa74" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.135767 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-684d6b469b-c8l2b" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api" containerID="cri-o://a0362c79961c85ff09cc537289495d9c32c6edf16fe15acfc1b254f825509254" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.144165 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.173657 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.228402 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.237466 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bbgd4"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.249375 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lpk6v"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.249427 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lpk6v"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.258464 4893 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.258719 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7776dc7c77-9lm7q" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-httpd" containerID="cri-o://26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.259077 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7776dc7c77-9lm7q" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-server" containerID="cri-o://52854fc52e28bf94ad21c38ff77654cb2e7662a16eb28502030a3a7f5acaab4f" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.263680 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.271458 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bbgd4"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.277849 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.278101 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="12b3c392-b02f-435f-8a96-04ad97890449" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.280450 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.294879 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vwvm"] Mar 14 
07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.297035 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.297290 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.304492 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-96kn9"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.314504 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.314746 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7ab1a7eb14a49fb840db8cffee0e903b4ae4cb885907191a6bc0ab1496f7f86c" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.325605 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-96kn9"] Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.344661 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.344735 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data podName:a752b3c8-284e-490f-be39-506e7a075c6f nodeName:}" failed. No retries permitted until 2026-03-14 07:24:30.344718066 +0000 UTC m=+1549.606894858 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data") pod "rabbitmq-server-0" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f") : configmap "rabbitmq-config-data" not found Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.350745 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="rabbitmq" containerID="cri-o://a5f24be09a6c54f86e91629c32362dd6fc01dad63e4f1a792a6a2d72329e86ae" gracePeriod=604800 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.355182 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vwvm"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.422896 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="rabbitmq" containerID="cri-o://ee01de4008b1335a2c5408eb31930a4fb5131e254bc97793e1f6069614788d91" gracePeriod=604800 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.503708 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7bbf4_457f660f-9b87-4d37-a92e-0c30bb2a2fea/openstack-network-exporter/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.503797 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.505879 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396 is running failed: container process not found" containerID="89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.506234 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396 is running failed: container process not found" containerID="89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.506754 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396 is running failed: container process not found" containerID="89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.506825 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="ovsdbserver-nb" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.527576 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_e12ac0dd-46ad-4d51-9ebc-acdd264649b2/ovsdbserver-nb/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.527653 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.578701 4893 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 14 07:24:28 crc kubenswrapper[4893]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 14 07:24:28 crc kubenswrapper[4893]: + source /usr/local/bin/container-scripts/functions Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNBridge=br-int Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNRemote=tcp:localhost:6642 Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNEncapType=geneve Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNAvailabilityZones= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ EnableChassisAsGateway=true Mar 14 07:24:28 crc kubenswrapper[4893]: ++ PhysicalNetworks= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNHostName= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 14 07:24:28 crc kubenswrapper[4893]: ++ ovs_dir=/var/lib/openvswitch Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 14 07:24:28 crc kubenswrapper[4893]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + cleanup_ovsdb_server_semaphore Mar 14 07:24:28 crc kubenswrapper[4893]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 14 07:24:28 crc kubenswrapper[4893]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-bwq2l" message=< Mar 14 07:24:28 crc kubenswrapper[4893]: Exiting ovsdb-server (5) [ OK ] Mar 14 07:24:28 crc kubenswrapper[4893]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 14 07:24:28 crc kubenswrapper[4893]: + source /usr/local/bin/container-scripts/functions Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNBridge=br-int Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNRemote=tcp:localhost:6642 Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNEncapType=geneve Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNAvailabilityZones= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ EnableChassisAsGateway=true Mar 14 07:24:28 crc kubenswrapper[4893]: ++ PhysicalNetworks= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNHostName= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 14 07:24:28 crc kubenswrapper[4893]: ++ ovs_dir=/var/lib/openvswitch Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 14 07:24:28 crc kubenswrapper[4893]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + cleanup_ovsdb_server_semaphore Mar 14 07:24:28 crc kubenswrapper[4893]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 14 07:24:28 crc kubenswrapper[4893]: > Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.578754 4893 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 14 07:24:28 crc kubenswrapper[4893]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 14 07:24:28 crc kubenswrapper[4893]: + source /usr/local/bin/container-scripts/functions Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNBridge=br-int Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNRemote=tcp:localhost:6642 Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNEncapType=geneve Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNAvailabilityZones= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ EnableChassisAsGateway=true Mar 14 07:24:28 crc kubenswrapper[4893]: ++ PhysicalNetworks= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ OVNHostName= Mar 14 07:24:28 crc kubenswrapper[4893]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 14 07:24:28 crc kubenswrapper[4893]: ++ ovs_dir=/var/lib/openvswitch Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 14 07:24:28 crc kubenswrapper[4893]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 14 07:24:28 crc kubenswrapper[4893]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + sleep 0.5 Mar 14 07:24:28 crc kubenswrapper[4893]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 14 07:24:28 crc kubenswrapper[4893]: + cleanup_ovsdb_server_semaphore Mar 14 07:24:28 crc kubenswrapper[4893]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 14 07:24:28 crc kubenswrapper[4893]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 14 07:24:28 crc kubenswrapper[4893]: > pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" containerID="cri-o://14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.578790 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" containerID="cri-o://14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" gracePeriod=29 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.597464 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="galera" containerID="cri-o://027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" gracePeriod=30 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.603950 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1f33b9a-21cf-4ca9-82a1-5fb7592bb749/ovsdbserver-sb/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: 
I0314 07:24:28.604021 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.604399 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650308 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650425 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650488 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650511 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5phk4\" (UniqueName: \"kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650586 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650614 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650648 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650691 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650717 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650752 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650791 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650813 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650871 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v82fz\" (UniqueName: \"kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz\") pod \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\" (UID: \"457f660f-9b87-4d37-a92e-0c30bb2a2fea\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.650903 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle\") pod \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\" (UID: \"e12ac0dd-46ad-4d51-9ebc-acdd264649b2\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.651495 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.651580 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config" (OuterVolumeSpecName: "config") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.654623 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config" (OuterVolumeSpecName: "config") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.655120 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.655300 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.655616 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts" (OuterVolumeSpecName: "scripts") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.669889 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz" (OuterVolumeSpecName: "kube-api-access-v82fz") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "kube-api-access-v82fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.670117 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4" (OuterVolumeSpecName: "kube-api-access-5phk4") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "kube-api-access-5phk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.678659 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.682327 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.692110 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.718273 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.736016 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:28 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: if [ -n "nova_api" ]; then Mar 14 07:24:28 crc kubenswrapper[4893]: GRANT_DATABASE="nova_api" Mar 14 07:24:28 crc kubenswrapper[4893]: else Mar 14 07:24:28 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:28 crc kubenswrapper[4893]: fi Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:28 crc kubenswrapper[4893]: 
# 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:28 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:28 crc kubenswrapper[4893]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:28 crc kubenswrapper[4893]: # support updates Mar 14 07:24:28 crc kubenswrapper[4893]: Mar 14 07:24:28 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.737602 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-f82a-account-create-update-cwrq9" podUID="6bd578b9-f296-486d-ba12-376e227c3b09" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.744697 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752237 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752292 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752320 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752348 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752366 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752387 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752850 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvlcj\" (UniqueName: \"kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752883 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb\") pod \"5241cf43-f60b-4499-ae07-6b449f6ef57e\" (UID: \"5241cf43-f60b-4499-ae07-6b449f6ef57e\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.752912 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.753167 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clxtq\" (UniqueName: \"kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.753217 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc 
kubenswrapper[4893]: I0314 07:24:28.753269 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.753493 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.753549 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756570 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v82fz\" (UniqueName: \"kubernetes.io/projected/457f660f-9b87-4d37-a92e-0c30bb2a2fea-kube-api-access-v82fz\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756603 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756620 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756630 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5phk4\" (UniqueName: 
\"kubernetes.io/projected/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-kube-api-access-5phk4\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756639 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756769 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457f660f-9b87-4d37-a92e-0c30bb2a2fea-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756778 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756787 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756795 4893 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/457f660f-9b87-4d37-a92e-0c30bb2a2fea-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756805 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.756828 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.763749 4893 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts" (OuterVolumeSpecName: "scripts") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.775127 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" containerID="cri-o://815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" gracePeriod=29 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.778251 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config" (OuterVolumeSpecName: "config") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.779052 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.794340 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj" (OuterVolumeSpecName: "kube-api-access-nvlcj") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "kube-api-access-nvlcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.801487 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.801918 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.813761 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq" (OuterVolumeSpecName: "kube-api-access-clxtq") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "kube-api-access-clxtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.831909 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.842579 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e12ac0dd-46ad-4d51-9ebc-acdd264649b2" (UID: "e12ac0dd-46ad-4d51-9ebc-acdd264649b2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.843373 4893 generic.go:334] "Generic (PLEG): container finished" podID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerID="928a88e96ca63f18e3047a55a85109e7284281e3348f6b7748c2c86111c4f15e" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.843540 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerDied","Data":"928a88e96ca63f18e3047a55a85109e7284281e3348f6b7748c2c86111c4f15e"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.847589 4893 generic.go:334] "Generic (PLEG): container finished" podID="2bee0811-3177-4034-aa99-39158e55c44f" containerID="7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b" exitCode=0 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.847792 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerDied","Data":"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.858592 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret\") pod \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.858639 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config\") pod \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.858696 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4fvm\" (UniqueName: \"kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm\") pod \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.858727 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle\") pod \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\" (UID: \"3a9bdb3e-d59b-4614-a1dd-bc6212ba104c\") " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859916 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859943 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859954 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859963 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvlcj\" (UniqueName: \"kubernetes.io/projected/5241cf43-f60b-4499-ae07-6b449f6ef57e-kube-api-access-nvlcj\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859975 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859983 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clxtq\" (UniqueName: \"kubernetes.io/projected/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-kube-api-access-clxtq\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.859992 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e12ac0dd-46ad-4d51-9ebc-acdd264649b2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.860000 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.860022 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.867169 4893 generic.go:334] "Generic (PLEG): container finished" podID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerID="d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80" exitCode=143 Mar 
14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.867416 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerDied","Data":"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.885444 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm" (OuterVolumeSpecName: "kube-api-access-g4fvm") pod "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" (UID: "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c"). InnerVolumeSpecName "kube-api-access-g4fvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.886053 4893 generic.go:334] "Generic (PLEG): container finished" podID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerID="a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.886186 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerDied","Data":"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.893772 4893 generic.go:334] "Generic (PLEG): container finished" podID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerID="bea0181f97653e52ef6b00db91426f325b021b8db102f6f9ca17b54176110d69" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.893978 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerDied","Data":"bea0181f97653e52ef6b00db91426f325b021b8db102f6f9ca17b54176110d69"} Mar 14 07:24:28 crc kubenswrapper[4893]: E0314 07:24:28.903991 4893 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079232b7_87bb_42cf_96ff_1eb2d1cfe2b5.slice/crio-conmon-42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec3a7835_99ba_4d0d_b81d_2dea0dc7128b.slice/crio-14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod203abd37_654f_480c_8a9d_719d767aec4d.slice/crio-26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079232b7_87bb_42cf_96ff_1eb2d1cfe2b5.slice/crio-e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d7922c_f9be_40bc_ba17_ec777a331998.slice/crio-76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d7922c_f9be_40bc_ba17_ec777a331998.slice/crio-conmon-76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.906426 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1f33b9a-21cf-4ca9-82a1-5fb7592bb749/ovsdbserver-sb/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.906686 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749","Type":"ContainerDied","Data":"4b5631e650c731c2a185bb7430db72d82be2d70d5dbcd4c9189cb4f2306de86d"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.906802 4893 scope.go:117] "RemoveContainer" containerID="eace24dbb800288f6b1dfc821a8adc0ebb57521e5ce25c1b28ee9518e85fe50f" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.906733 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.920452 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.929712 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.930678 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e12ac0dd-46ad-4d51-9ebc-acdd264649b2/ovsdbserver-nb/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.930802 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e12ac0dd-46ad-4d51-9ebc-acdd264649b2","Type":"ContainerDied","Data":"30159984625ce8d1a17f6484306a447ef75f2b4c6c364fedd5f4711d1cf4fe3e"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.930920 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.933574 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" (UID: "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.939254 4893 generic.go:334] "Generic (PLEG): container finished" podID="49d7922c-f9be-40bc-ba17-ec777a331998" containerID="dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.939311 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerDied","Data":"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955151 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936" exitCode=0 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955184 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1" exitCode=0 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955191 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051" exitCode=0 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955197 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2" exitCode=0 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955238 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955263 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955274 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.955284 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.959943 4893 generic.go:334] "Generic (PLEG): container finished" podID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerID="492cd7441a800dc10a30bd9b6a28dd3d89e262a94b290571698a0218593d8d7f" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.960071 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerDied","Data":"492cd7441a800dc10a30bd9b6a28dd3d89e262a94b290571698a0218593d8d7f"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.960745 4893 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.961393 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.961467 4893 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.961550 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.961617 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4fvm\" (UniqueName: \"kubernetes.io/projected/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-kube-api-access-g4fvm\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.962495 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f82a-account-create-update-cwrq9" event={"ID":"6bd578b9-f296-486d-ba12-376e227c3b09","Type":"ContainerStarted","Data":"9075da906b5896435624346f4d356cdb99afb4688efdfce646beac4798852170"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.964007 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.967274 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7bbf4_457f660f-9b87-4d37-a92e-0c30bb2a2fea/openstack-network-exporter/0.log" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.967330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7bbf4" event={"ID":"457f660f-9b87-4d37-a92e-0c30bb2a2fea","Type":"ContainerDied","Data":"a755deaf332e8bde1c5b1a0c6f5a8dcc8c488df2d0fcb490d3d75e182b1de7d4"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.967384 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.969948 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" (UID: "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.981368 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.982633 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.982927 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" event={"ID":"5241cf43-f60b-4499-ae07-6b449f6ef57e","Type":"ContainerDied","Data":"1d96da80206a893b073d69a3fa67a054b52bdf5bab173b3ca52d2ed565c61cd3"} Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.996899 4893 generic.go:334] "Generic (PLEG): container finished" podID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerID="142a5e92673e832e9dc3ab1aaae6b53df019680363c32f4373ec08b60c93fa74" exitCode=143 Mar 14 07:24:28 crc kubenswrapper[4893]: I0314 07:24:28.996976 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerDied","Data":"142a5e92673e832e9dc3ab1aaae6b53df019680363c32f4373ec08b60c93fa74"} Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.001725 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.005818 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" (UID: "3a9bdb3e-d59b-4614-a1dd-bc6212ba104c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.009351 4893 generic.go:334] "Generic (PLEG): container finished" podID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" containerID="743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090" exitCode=137 Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.009513 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.009915 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "457f660f-9b87-4d37-a92e-0c30bb2a2fea" (UID: "457f660f-9b87-4d37-a92e-0c30bb2a2fea"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.015659 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-hkpb6" event={"ID":"c9a21025-970d-4b50-8f80-b0926242b929","Type":"ContainerStarted","Data":"f12c3a3e28c4e24d17c934f0dfe8ea2e3761eb5fe3bf77770f89caee4b0dfa8d"} Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.034710 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.063680 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.063711 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc 
kubenswrapper[4893]: I0314 07:24:29.063720 4893 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.063732 4893 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.063741 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.063750 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/457f660f-9b87-4d37-a92e-0c30bb2a2fea-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.109584 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.123038 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.131789 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.139505 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config" (OuterVolumeSpecName: "config") pod "5241cf43-f60b-4499-ae07-6b449f6ef57e" (UID: "5241cf43-f60b-4499-ae07-6b449f6ef57e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.166805 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.166906 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") pod \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\" (UID: \"e1f33b9a-21cf-4ca9-82a1-5fb7592bb749\") " Mar 14 07:24:29 crc kubenswrapper[4893]: W0314 07:24:29.168009 4893 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749/volumes/kubernetes.io~secret/metrics-certs-tls-certs Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.168023 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" (UID: "e1f33b9a-21cf-4ca9-82a1-5fb7592bb749"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.168752 4893 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.168772 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.168782 4893 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.168794 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241cf43-f60b-4499-ae07-6b449f6ef57e-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.175366 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7776dc7c77-9lm7q" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.177:8080/healthcheck\": dial tcp 10.217.0.177:8080: connect: connection refused" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.175441 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7776dc7c77-9lm7q" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.177:8080/healthcheck\": dial tcp 10.217.0.177:8080: connect: connection refused" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.233659 4893 scope.go:117] "RemoveContainer" 
containerID="d32e5cf30b4cda1330ae637ea8cabb6506d4c4567367cf5b048fc3ef60031d41" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.256488 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.285221 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:29 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: if [ -n "placement" ]; then Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="placement" Mar 14 07:24:29 crc kubenswrapper[4893]: else Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:29 crc kubenswrapper[4893]: fi Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:29 crc kubenswrapper[4893]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:29 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:29 crc kubenswrapper[4893]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:29 crc kubenswrapper[4893]: # support updates Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.285245 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:29 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: if [ -n "neutron" ]; then Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="neutron" Mar 14 07:24:29 crc kubenswrapper[4893]: else Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:29 crc kubenswrapper[4893]: fi Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:29 crc kubenswrapper[4893]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:29 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:29 crc kubenswrapper[4893]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:29 crc kubenswrapper[4893]: # support updates Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.288291 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-3b5a-account-create-update-hkpb6" podUID="c9a21025-970d-4b50-8f80-b0926242b929" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.291632 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-78ff-account-create-update-rbfhn" podUID="22218c00-c79a-4d4a-a0e4-dc59f10ebaaf" Mar 14 07:24:29 crc kubenswrapper[4893]: W0314 07:24:29.318014 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5296cc7d_3008_44e6_ae0b_f88c333e13aa.slice/crio-09399e503c45b2d4ae5828d5e690c1fba86620964c10b469e35cc06a37f5c97f WatchSource:0}: Error finding container 09399e503c45b2d4ae5828d5e690c1fba86620964c10b469e35cc06a37f5c97f: Status 404 returned error can't find the container with id 09399e503c45b2d4ae5828d5e690c1fba86620964c10b469e35cc06a37f5c97f Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.327489 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:29 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc 
kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: if [ -n "nova_cell1" ]; then Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="nova_cell1" Mar 14 07:24:29 crc kubenswrapper[4893]: else Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:29 crc kubenswrapper[4893]: fi Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:29 crc kubenswrapper[4893]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:29 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:29 crc kubenswrapper[4893]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:29 crc kubenswrapper[4893]: # support updates Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.327998 4893 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:29 crc kubenswrapper[4893]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: if [ -n "nova_cell0" ]; then Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="nova_cell0" Mar 14 07:24:29 crc kubenswrapper[4893]: else Mar 14 07:24:29 crc kubenswrapper[4893]: GRANT_DATABASE="*" Mar 14 07:24:29 crc kubenswrapper[4893]: fi Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: # going for maximum compatibility here: Mar 14 07:24:29 crc kubenswrapper[4893]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:29 crc kubenswrapper[4893]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:29 crc kubenswrapper[4893]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:29 crc kubenswrapper[4893]: # support updates Mar 14 07:24:29 crc kubenswrapper[4893]: Mar 14 07:24:29 crc kubenswrapper[4893]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.329307 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" podUID="ab0242d4-11e1-4a4b-9e55-a841f2ba874d" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.331180 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" podUID="5296cc7d-3008-44e6-ae0b-f88c333e13aa" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.364405 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.365361 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.366344 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.366372 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="galera" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.396232 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4ee6b9-d425-48ad-a01e-9fbb7354e798" path="/var/lib/kubelet/pods/0b4ee6b9-d425-48ad-a01e-9fbb7354e798/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.397341 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12390666-c0e4-4e0f-90fe-1cf230bc4702" path="/var/lib/kubelet/pods/12390666-c0e4-4e0f-90fe-1cf230bc4702/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.397983 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dd0d41-68ca-458d-9011-a2c167fda868" path="/var/lib/kubelet/pods/24dd0d41-68ca-458d-9011-a2c167fda868/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.399189 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9bdb3e-d59b-4614-a1dd-bc6212ba104c" path="/var/lib/kubelet/pods/3a9bdb3e-d59b-4614-a1dd-bc6212ba104c/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.400015 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438ee5c6-8f2a-491b-903d-78537b8465f4" path="/var/lib/kubelet/pods/438ee5c6-8f2a-491b-903d-78537b8465f4/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.401662 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="481d0c9c-3ac9-4bae-bd8c-52489deda58c" path="/var/lib/kubelet/pods/481d0c9c-3ac9-4bae-bd8c-52489deda58c/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.403952 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cda34b-1f0b-463f-92a5-ba03353eac80" path="/var/lib/kubelet/pods/48cda34b-1f0b-463f-92a5-ba03353eac80/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.404672 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b191a119-f244-4d99-98f2-e0ae52bd6613" path="/var/lib/kubelet/pods/b191a119-f244-4d99-98f2-e0ae52bd6613/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.405925 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57ae19b-38cb-419e-ab8f-7f0bb5ead383" path="/var/lib/kubelet/pods/b57ae19b-38cb-419e-ab8f-7f0bb5ead383/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.406832 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" path="/var/lib/kubelet/pods/e12ac0dd-46ad-4d51-9ebc-acdd264649b2/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.407656 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e580c7ca-eac4-4dd6-bdd6-478814d7f65d" path="/var/lib/kubelet/pods/e580c7ca-eac4-4dd6-bdd6-478814d7f65d/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.408952 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e685f75c-8311-4120-88ca-be0e6060d132" path="/var/lib/kubelet/pods/e685f75c-8311-4120-88ca-be0e6060d132/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.409875 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77e9706-a6b7-4f26-9897-8f5d66642a67" path="/var/lib/kubelet/pods/e77e9706-a6b7-4f26-9897-8f5d66642a67/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.410693 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3" path="/var/lib/kubelet/pods/ef8f0e2b-b3e6-406e-9ee7-7569256f6bc3/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.412268 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25145f5-abb0-4f54-aea2-d23716f0af23" path="/var/lib/kubelet/pods/f25145f5-abb0-4f54-aea2-d23716f0af23/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.412837 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50f3b01-c781-4d53-8c0c-62cb289ebbde" path="/var/lib/kubelet/pods/f50f3b01-c781-4d53-8c0c-62cb289ebbde/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.413328 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a" path="/var/lib/kubelet/pods/f6d550ac-ae8f-4fec-97fc-0e9816dd6b7a/volumes" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.514495 4893 scope.go:117] "RemoveContainer" containerID="8649f7f4f4c1d54eaf72dbf5276bdbd35d2015a4eb0c3467c699f17e2aee8e05" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.524022 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.549236 4893 scope.go:117] "RemoveContainer" containerID="89d23ad9d8b1070ff1ada0fd7fa8efcb4659ecb5a3b9953b1e1bb4bcf13f5396" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.560937 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.567371 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.616089 4893 scope.go:117] "RemoveContainer" containerID="eda28ec59118715e8794d9877944b4ef0d5e23bf08d3c266e2f2246bfd2ec127" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.653080 4893 scope.go:117] "RemoveContainer" containerID="acd718ecc01f85e0d44e4851299d2daef071eb198233eb55a80f4008d7f7cece" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.682175 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts\") pod \"6bd578b9-f296-486d-ba12-376e227c3b09\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.682336 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhjfl\" (UniqueName: \"kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl\") pod \"6bd578b9-f296-486d-ba12-376e227c3b09\" (UID: \"6bd578b9-f296-486d-ba12-376e227c3b09\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.687351 4893 scope.go:117] "RemoveContainer" containerID="8a23c67b80013882f7c50d0e0b6c68e638069643a08b3c1cf1fe8d7791b8d9ab" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.687766 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bd578b9-f296-486d-ba12-376e227c3b09" (UID: "6bd578b9-f296-486d-ba12-376e227c3b09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.689874 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl" (OuterVolumeSpecName: "kube-api-access-fhjfl") pod "6bd578b9-f296-486d-ba12-376e227c3b09" (UID: "6bd578b9-f296-486d-ba12-376e227c3b09"). InnerVolumeSpecName "kube-api-access-fhjfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.704884 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.734758 4893 scope.go:117] "RemoveContainer" containerID="743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.771709 4893 scope.go:117] "RemoveContainer" containerID="743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090" Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.778718 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090\": container with ID starting with 743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090 not found: ID does not exist" containerID="743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.778804 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090"} err="failed to get container status \"743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090\": rpc error: code = NotFound desc = could not find container \"743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090\": container with ID starting with 743b3347e9bfc67743c1d12d642d3877d05a6b762a5e1bfa35ad0c3f14efe090 not found: ID does not exist" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.785186 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bd578b9-f296-486d-ba12-376e227c3b09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.785238 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhjfl\" (UniqueName: \"kubernetes.io/projected/6bd578b9-f296-486d-ba12-376e227c3b09-kube-api-access-fhjfl\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.886289 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom\") pod \"49d7922c-f9be-40bc-ba17-ec777a331998\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.886374 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle\") pod \"49d7922c-f9be-40bc-ba17-ec777a331998\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.886402 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data\") pod 
\"49d7922c-f9be-40bc-ba17-ec777a331998\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.886432 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88zmj\" (UniqueName: \"kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj\") pod \"49d7922c-f9be-40bc-ba17-ec777a331998\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.886591 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs\") pod \"49d7922c-f9be-40bc-ba17-ec777a331998\" (UID: \"49d7922c-f9be-40bc-ba17-ec777a331998\") " Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.887860 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:29 crc kubenswrapper[4893]: E0314 07:24:29.887914 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data podName:7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e nodeName:}" failed. No retries permitted until 2026-03-14 07:24:33.887900988 +0000 UTC m=+1553.150077780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data") pod "rabbitmq-cell1-server-0" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.894901 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs" (OuterVolumeSpecName: "logs") pod "49d7922c-f9be-40bc-ba17-ec777a331998" (UID: "49d7922c-f9be-40bc-ba17-ec777a331998"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.926933 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49d7922c-f9be-40bc-ba17-ec777a331998" (UID: "49d7922c-f9be-40bc-ba17-ec777a331998"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.930956 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.938846 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj" (OuterVolumeSpecName: "kube-api-access-88zmj") pod "49d7922c-f9be-40bc-ba17-ec777a331998" (UID: "49d7922c-f9be-40bc-ba17-ec777a331998"). InnerVolumeSpecName "kube-api-access-88zmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.977307 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d7922c-f9be-40bc-ba17-ec777a331998" (UID: "49d7922c-f9be-40bc-ba17-ec777a331998"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.993737 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.993778 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.993794 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88zmj\" (UniqueName: \"kubernetes.io/projected/49d7922c-f9be-40bc-ba17-ec777a331998-kube-api-access-88zmj\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:29 crc kubenswrapper[4893]: I0314 07:24:29.993807 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49d7922c-f9be-40bc-ba17-ec777a331998-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.010220 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.034472 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data" (OuterVolumeSpecName: "config-data") pod "49d7922c-f9be-40bc-ba17-ec777a331998" (UID: "49d7922c-f9be-40bc-ba17-ec777a331998"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.037353 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerStarted","Data":"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.048081 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" event={"ID":"5296cc7d-3008-44e6-ae0b-f88c333e13aa","Type":"ContainerStarted","Data":"09399e503c45b2d4ae5828d5e690c1fba86620964c10b469e35cc06a37f5c97f"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.086854 4893 generic.go:334] "Generic (PLEG): container finished" podID="49d7922c-f9be-40bc-ba17-ec777a331998" containerID="76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.086895 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.086977 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerDied","Data":"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.087007 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4f4558c4-87m4w" event={"ID":"49d7922c-f9be-40bc-ba17-ec777a331998","Type":"ContainerDied","Data":"a93b9a216f3578d1686b9815fb061a16066f44f3f6d7d1ecddb888b83af71220"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.087029 4893 scope.go:117] "RemoveContainer" containerID="76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.089127 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qdfzm" podStartSLOduration=3.1258634770000002 podStartE2EDuration="8.089106845s" podCreationTimestamp="2026-03-14 07:24:22 +0000 UTC" firstStartedPulling="2026-03-14 07:24:24.364679633 +0000 UTC m=+1543.626856415" lastFinishedPulling="2026-03-14 07:24:29.327922991 +0000 UTC m=+1548.590099783" observedRunningTime="2026-03-14 07:24:30.080235031 +0000 UTC m=+1549.342411833" watchObservedRunningTime="2026-03-14 07:24:30.089106845 +0000 UTC m=+1549.351283637" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.094288 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxns2\" (UniqueName: \"kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2\") pod \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.094334 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs\") pod \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.094397 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle\") pod \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.094449 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data\") pod \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.094504 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom\") pod \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\" (UID: \"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.096812 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs" (OuterVolumeSpecName: "logs") pod "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" (UID: "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.100309 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7922c-f9be-40bc-ba17-ec777a331998-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.100341 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.101080 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2" (OuterVolumeSpecName: "kube-api-access-rxns2") pod "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" (UID: "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4"). InnerVolumeSpecName "kube-api-access-rxns2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.108499 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" (UID: "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.110493 4893 generic.go:334] "Generic (PLEG): container finished" podID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerID="41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.110632 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68559f9fc9-2zprf" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.111493 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerDied","Data":"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.111556 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68559f9fc9-2zprf" event={"ID":"1fc48c5b-eba4-4ce3-b68a-289b737dd9c4","Type":"ContainerDied","Data":"c45b4223c1181b0b8ee7a4e885b762497d18d2d9d9cc57f3b95405b85294b2d2"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.124002 4893 generic.go:334] "Generic (PLEG): container finished" podID="12b3c392-b02f-435f-8a96-04ad97890449" containerID="8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.124148 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.124202 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12b3c392-b02f-435f-8a96-04ad97890449","Type":"ContainerDied","Data":"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.124242 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12b3c392-b02f-435f-8a96-04ad97890449","Type":"ContainerDied","Data":"15eb4af16117dcb4aace993e70f4192634171d7f8aa9f27dbd85e3a66bc48a50"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.139755 4893 generic.go:334] "Generic (PLEG): container finished" podID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.139870 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerDied","Data":"14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.147700 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" (UID: "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149530 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149561 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149569 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149577 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149585 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149593 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149601 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149608 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149615 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149622 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149664 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149691 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149711 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149721 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149731 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149740 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149750 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149771 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.149779 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.151168 4893 generic.go:334] "Generic (PLEG): container finished" podID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerID="1dde3e9c9cf1287de08c1c44f96f41fa14af9999481d035ed1d21d2907049654" exitCode=1 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.151205 4893 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-zwt4c" event={"ID":"e6be5c8e-c381-4e29-90e7-069d902c1805","Type":"ContainerDied","Data":"1dde3e9c9cf1287de08c1c44f96f41fa14af9999481d035ed1d21d2907049654"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.151732 4893 scope.go:117] "RemoveContainer" containerID="1dde3e9c9cf1287de08c1c44f96f41fa14af9999481d035ed1d21d2907049654" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.163309 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" event={"ID":"ab0242d4-11e1-4a4b-9e55-a841f2ba874d","Type":"ContainerStarted","Data":"f9a3e283e0e8a200e5b788aec0d31bffb3a047481f0a7bb6368ffe7377217f2d"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.183492 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data" (OuterVolumeSpecName: "config-data") pod "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" (UID: "1fc48c5b-eba4-4ce3-b68a-289b737dd9c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.192236 4893 generic.go:334] "Generic (PLEG): container finished" podID="203abd37-654f-480c-8a9d-719d767aec4d" containerID="52854fc52e28bf94ad21c38ff77654cb2e7662a16eb28502030a3a7f5acaab4f" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.192263 4893 generic.go:334] "Generic (PLEG): container finished" podID="203abd37-654f-480c-8a9d-719d767aec4d" containerID="26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.192324 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerDied","Data":"52854fc52e28bf94ad21c38ff77654cb2e7662a16eb28502030a3a7f5acaab4f"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.192350 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerDied","Data":"26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.201442 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8cxh\" (UniqueName: \"kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh\") pod \"12b3c392-b02f-435f-8a96-04ad97890449\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.201515 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle\") pod \"12b3c392-b02f-435f-8a96-04ad97890449\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.201653 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs\") pod \"12b3c392-b02f-435f-8a96-04ad97890449\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.201675 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs\") pod \"12b3c392-b02f-435f-8a96-04ad97890449\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.201743 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data\") pod \"12b3c392-b02f-435f-8a96-04ad97890449\" (UID: \"12b3c392-b02f-435f-8a96-04ad97890449\") " Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.202214 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxns2\" (UniqueName: \"kubernetes.io/projected/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-kube-api-access-rxns2\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.202231 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.202242 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.202251 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.213208 4893 generic.go:334] "Generic (PLEG): container finished" podID="1239ac87-7084-45c6-9eef-ecab07108656" containerID="6d4a0c5d5f1c982cd165c28bd58f8c768f1c7c5b718e400d69233659dc76ae51" exitCode=0 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.213278 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerDied","Data":"6d4a0c5d5f1c982cd165c28bd58f8c768f1c7c5b718e400d69233659dc76ae51"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.218829 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f82a-account-create-update-cwrq9" event={"ID":"6bd578b9-f296-486d-ba12-376e227c3b09","Type":"ContainerDied","Data":"9075da906b5896435624346f4d356cdb99afb4688efdfce646beac4798852170"} Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.218906 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f82a-account-create-update-cwrq9" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.235798 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh" (OuterVolumeSpecName: "kube-api-access-t8cxh") pod "12b3c392-b02f-435f-8a96-04ad97890449" (UID: "12b3c392-b02f-435f-8a96-04ad97890449"). InnerVolumeSpecName "kube-api-access-t8cxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.254182 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12b3c392-b02f-435f-8a96-04ad97890449" (UID: "12b3c392-b02f-435f-8a96-04ad97890449"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.301802 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "12b3c392-b02f-435f-8a96-04ad97890449" (UID: "12b3c392-b02f-435f-8a96-04ad97890449"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.301882 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data" (OuterVolumeSpecName: "config-data") pod "12b3c392-b02f-435f-8a96-04ad97890449" (UID: "12b3c392-b02f-435f-8a96-04ad97890449"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.301927 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "12b3c392-b02f-435f-8a96-04ad97890449" (UID: "12b3c392-b02f-435f-8a96-04ad97890449"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.303747 4893 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.303770 4893 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.303779 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.303787 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8cxh\" (UniqueName: \"kubernetes.io/projected/12b3c392-b02f-435f-8a96-04ad97890449-kube-api-access-t8cxh\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.303796 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b3c392-b02f-435f-8a96-04ad97890449-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.351221 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.391216 4893 scope.go:117] "RemoveContainer" containerID="dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.407172 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:30 crc 
kubenswrapper[4893]: E0314 07:24:30.407238 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data podName:a752b3c8-284e-490f-be39-506e7a075c6f nodeName:}" failed. No retries permitted until 2026-03-14 07:24:34.407225081 +0000 UTC m=+1553.669401873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data") pod "rabbitmq-server-0" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f") : configmap "rabbitmq-config-data" not found Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.416936 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6f4f4558c4-87m4w"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.453326 4893 scope.go:117] "RemoveContainer" containerID="76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.457920 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4\": container with ID starting with 76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4 not found: ID does not exist" containerID="76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.457969 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4"} err="failed to get container status \"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4\": rpc error: code = NotFound desc = could not find container \"76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4\": container with ID starting with 
76671f2af7de340854a84a9a135d9964bd871f327e60af0a310986cec234a8f4 not found: ID does not exist" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.458014 4893 scope.go:117] "RemoveContainer" containerID="dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.462377 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0\": container with ID starting with dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0 not found: ID does not exist" containerID="dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.462420 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0"} err="failed to get container status \"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0\": rpc error: code = NotFound desc = could not find container \"dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0\": container with ID starting with dfb104ad3656c054cc0079116aa3b717e79a6e2e4bfdd1d18ecce304d17812e0 not found: ID does not exist" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.462438 4893 scope.go:117] "RemoveContainer" containerID="41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.517579 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.523094 4893 scope.go:117] "RemoveContainer" containerID="a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.543077 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-f82a-account-create-update-cwrq9"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.555675 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.560636 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-68559f9fc9-2zprf"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.562649 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.564664 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.564722 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.566940 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.566977 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.567211 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.567269 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.573000 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.589495 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.589567 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.607876 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.176:8776/healthcheck\": read tcp 10.217.0.2:56866->10.217.0.176:8776: read: connection reset by peer" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.628202 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:24:30 crc kubenswrapper[4893]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 14 07:24:30 crc kubenswrapper[4893]: > Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.645088 4893 scope.go:117] "RemoveContainer" containerID="41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.645723 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526\": container with ID starting with 41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526 not found: ID does not exist" containerID="41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.645779 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526"} err="failed to get container status \"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526\": 
rpc error: code = NotFound desc = could not find container \"41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526\": container with ID starting with 41c90b70a47f5b7fd9f983850f85e1893e4b84b4238a9030cf46ac107b91a526 not found: ID does not exist" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.645806 4893 scope.go:117] "RemoveContainer" containerID="a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.646178 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2\": container with ID starting with a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2 not found: ID does not exist" containerID="a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.646198 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2"} err="failed to get container status \"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2\": rpc error: code = NotFound desc = could not find container \"a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2\": container with ID starting with a6139b7db93a2d6935130af661309f66a9cc34448d14819a79327f7b0ac80cc2 not found: ID does not exist" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.646231 4893 scope.go:117] "RemoveContainer" containerID="8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.754366 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.755121 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-central-agent" containerID="cri-o://55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4" gracePeriod=30 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.755637 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="proxy-httpd" containerID="cri-o://5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3" gracePeriod=30 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.755666 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-notification-agent" containerID="cri-o://93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000" gracePeriod=30 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.755647 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="sg-core" containerID="cri-o://93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537" gracePeriod=30 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.814329 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.814543 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1cad381a-c3bf-4fc8-a314-6f45028f3482" containerName="kube-state-metrics" containerID="cri-o://1479c505fa63564b68dc67398eb06f224838c90b598b1f33bb9bfdfc5ff3333d" gracePeriod=30 Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.850195 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.867910 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.889729 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1fc4-account-create-update-dlgm6"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.910662 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1fc4-account-create-update-dlgm6"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945076 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1fc4-account-create-update-9ngml"] Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945439 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="init" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945450 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="init" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945466 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="ovsdbserver-nb" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945472 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="ovsdbserver-nb" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945484 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b3c392-b02f-435f-8a96-04ad97890449" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945490 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b3c392-b02f-435f-8a96-04ad97890449" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945497 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945503 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945529 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-httpd" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945535 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-httpd" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945546 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945552 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945565 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945570 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945585 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker-log" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.945591 4893 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker-log" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.945599 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-server" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946385 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-server" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.946400 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946407 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.946421 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="dnsmasq-dns" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946427 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="dnsmasq-dns" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.946435 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946813 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.946826 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener-log" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946832 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener-log" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.946840 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="ovsdbserver-sb" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.946848 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="ovsdbserver-sb" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947024 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker-log" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947036 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947046 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" containerName="dnsmasq-dns" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947057 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-httpd" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947067 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947076 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener-log" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947084 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b3c392-b02f-435f-8a96-04ad97890449" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947092 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="203abd37-654f-480c-8a9d-719d767aec4d" containerName="proxy-server" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947103 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" containerName="barbican-worker" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947110 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" containerName="ovsdbserver-sb" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947117 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="openstack-network-exporter" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947128 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" containerName="barbican-keystone-listener" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947135 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12ac0dd-46ad-4d51-9ebc-acdd264649b2" containerName="ovsdbserver-nb" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.947683 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.950009 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.975796 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1fc4-account-create-update-9ngml"] Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.976775 4893 scope.go:117] "RemoveContainer" containerID="8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9" Mar 14 07:24:30 crc kubenswrapper[4893]: E0314 07:24:30.980644 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9\": container with ID starting with 8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9 not found: ID does not exist" containerID="8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9" Mar 14 07:24:30 crc kubenswrapper[4893]: I0314 07:24:30.980681 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9"} err="failed to get container status \"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9\": rpc error: code = NotFound desc = could not find container \"8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9\": container with ID starting with 8df0344d0d349890a42132c18330de316fe3521d61a0cdd8284849c4aacee7c9 not found: ID does not exist" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023131 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: 
\"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95l7h\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023357 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023435 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023479 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023501 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts\") pod \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023559 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dzmx\" 
(UniqueName: \"kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx\") pod \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\" (UID: \"5296cc7d-3008-44e6-ae0b-f88c333e13aa\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023588 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023637 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.023665 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs\") pod \"203abd37-654f-480c-8a9d-719d767aec4d\" (UID: \"203abd37-654f-480c-8a9d-719d767aec4d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.029697 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vc8cn"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.030077 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.030986 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5296cc7d-3008-44e6-ae0b-f88c333e13aa" (UID: "5296cc7d-3008-44e6-ae0b-f88c333e13aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.037877 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.042221 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h" (OuterVolumeSpecName: "kube-api-access-95l7h") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "kube-api-access-95l7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.046068 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx" (OuterVolumeSpecName: "kube-api-access-7dzmx") pod "5296cc7d-3008-44e6-ae0b-f88c333e13aa" (UID: "5296cc7d-3008-44e6-ae0b-f88c333e13aa"). InnerVolumeSpecName "kube-api-access-7dzmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.063856 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.108615 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vc8cn"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126040 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbkhk\" (UniqueName: \"kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126137 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126204 4893 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126215 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126224 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5296cc7d-3008-44e6-ae0b-f88c333e13aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126233 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dzmx\" (UniqueName: \"kubernetes.io/projected/5296cc7d-3008-44e6-ae0b-f88c333e13aa-kube-api-access-7dzmx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126241 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203abd37-654f-480c-8a9d-719d767aec4d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.126249 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95l7h\" (UniqueName: \"kubernetes.io/projected/203abd37-654f-480c-8a9d-719d767aec4d-kube-api-access-95l7h\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.147880 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5gj2n"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.169989 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5gj2n"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.226700 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.226917 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-787475576f-6cj4v" podUID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" containerName="keystone-api" 
containerID="cri-o://c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f" gracePeriod=30 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.228822 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbkhk\" (UniqueName: \"kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.228956 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.229184 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.229253 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:31.729239287 +0000 UTC m=+1550.991416079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.232861 4893 projected.go:194] Error preparing data for projected volume kube-api-access-jbkhk for pod openstack/keystone-1fc4-account-create-update-9ngml: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.232918 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:31.732902305 +0000 UTC m=+1550.995079097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jbkhk" (UniqueName: "kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.245958 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.257199 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776dc7c77-9lm7q" event={"ID":"203abd37-654f-480c-8a9d-719d767aec4d","Type":"ContainerDied","Data":"fcc09e4de4f9872e828eb1c3dc4431a94be74768e7c23a24e47a9ed68a0404cc"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.257257 4893 scope.go:117] "RemoveContainer" containerID="52854fc52e28bf94ad21c38ff77654cb2e7662a16eb28502030a3a7f5acaab4f" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.257484 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7776dc7c77-9lm7q" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.264962 4893 generic.go:334] "Generic (PLEG): container finished" podID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerID="471e6c6aa09a8b1a609a1618d206f8f19bac722872a90829f538c1ebf60510ec" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.265132 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerDied","Data":"471e6c6aa09a8b1a609a1618d206f8f19bac722872a90829f538c1ebf60510ec"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.266432 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b5a-account-create-update-hkpb6" event={"ID":"c9a21025-970d-4b50-8f80-b0926242b929","Type":"ContainerDied","Data":"f12c3a3e28c4e24d17c934f0dfe8ea2e3761eb5fe3bf77770f89caee4b0dfa8d"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.266454 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12c3a3e28c4e24d17c934f0dfe8ea2e3761eb5fe3bf77770f89caee4b0dfa8d" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.267631 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data" (OuterVolumeSpecName: "config-data") pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.267769 4893 generic.go:334] "Generic (PLEG): container finished" podID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerID="071be3d7b6163f25cab591d62f20975ae6d40b5630f0ede440ea7ddafb12315f" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.267811 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerDied","Data":"071be3d7b6163f25cab591d62f20975ae6d40b5630f0ede440ea7ddafb12315f"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.273661 4893 generic.go:334] "Generic (PLEG): container finished" podID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerID="2fc121bd16a2e48ba5ae0e8191ee49932905a50f6c55bad773e4ec5663d05d86" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.273707 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerDied","Data":"2fc121bd16a2e48ba5ae0e8191ee49932905a50f6c55bad773e4ec5663d05d86"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.274803 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" event={"ID":"5296cc7d-3008-44e6-ae0b-f88c333e13aa","Type":"ContainerDied","Data":"09399e503c45b2d4ae5828d5e690c1fba86620964c10b469e35cc06a37f5c97f"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.274827 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.292267 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc7c1963-417f-453f-8983-1c03d349f76d" containerID="5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.292296 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc7c1963-417f-453f-8983-1c03d349f76d" containerID="93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537" exitCode=2 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.292334 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerDied","Data":"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.292361 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerDied","Data":"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.292385 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.298815 4893 generic.go:334] "Generic (PLEG): container finished" podID="1cad381a-c3bf-4fc8-a314-6f45028f3482" containerID="1479c505fa63564b68dc67398eb06f224838c90b598b1f33bb9bfdfc5ff3333d" exitCode=2 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.298884 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1cad381a-c3bf-4fc8-a314-6f45028f3482","Type":"ContainerDied","Data":"1479c505fa63564b68dc67398eb06f224838c90b598b1f33bb9bfdfc5ff3333d"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.307490 4893 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-1fc4-account-create-update-9ngml"] Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.308376 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-jbkhk operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-1fc4-account-create-update-9ngml" podUID="0a23c46a-df73-4630-bc6b-8df7a4bff754" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.315327 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wjrnn"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.318281 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerID="4d1c426d2b310ef6f81cd1f91ebfb6670cc4b8ebb961307ca3eb73f633e88e39" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.318395 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerDied","Data":"4d1c426d2b310ef6f81cd1f91ebfb6670cc4b8ebb961307ca3eb73f633e88e39"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.319727 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" event={"ID":"ab0242d4-11e1-4a4b-9e55-a841f2ba874d","Type":"ContainerDied","Data":"f9a3e283e0e8a200e5b788aec0d31bffb3a047481f0a7bb6368ffe7377217f2d"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.319751 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9a3e283e0e8a200e5b788aec0d31bffb3a047481f0a7bb6368ffe7377217f2d" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.321827 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.322758 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wjrnn"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.324301 4893 generic.go:334] "Generic (PLEG): container finished" podID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerID="027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" exitCode=0 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.324473 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerDied","Data":"027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.328436 4893 generic.go:334] "Generic (PLEG): container finished" podID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerID="3e4b4f462f2eb63d686cf632d5ec4e1a3100f1a2224fff5ea78dfd30b8750794" exitCode=1 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.328465 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zwt4c" event={"ID":"e6be5c8e-c381-4e29-90e7-069d902c1805","Type":"ContainerDied","Data":"3e4b4f462f2eb63d686cf632d5ec4e1a3100f1a2224fff5ea78dfd30b8750794"} Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.329040 4893 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-zwt4c" secret="" err="secret \"galera-openstack-dockercfg-f6n62\" not found" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.329079 4893 scope.go:117] "RemoveContainer" containerID="3e4b4f462f2eb63d686cf632d5ec4e1a3100f1a2224fff5ea78dfd30b8750794" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.329499 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-zwt4c_openstack(e6be5c8e-c381-4e29-90e7-069d902c1805)\"" pod="openstack/root-account-create-update-zwt4c" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.330264 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.330283 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.330295 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.336297 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.338443 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") 
pod "203abd37-654f-480c-8a9d-719d767aec4d" (UID: "203abd37-654f-480c-8a9d-719d767aec4d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.390491 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b3c392-b02f-435f-8a96-04ad97890449" path="/var/lib/kubelet/pods/12b3c392-b02f-435f-8a96-04ad97890449/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.391392 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc48c5b-eba4-4ce3-b68a-289b737dd9c4" path="/var/lib/kubelet/pods/1fc48c5b-eba4-4ce3-b68a-289b737dd9c4/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.392088 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d7922c-f9be-40bc-ba17-ec777a331998" path="/var/lib/kubelet/pods/49d7922c-f9be-40bc-ba17-ec777a331998/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.393103 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d860af-b6b6-4796-855c-ade1a4f88f33" path="/var/lib/kubelet/pods/55d860af-b6b6-4796-855c-ade1a4f88f33/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.393659 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b69960-585e-4e89-a290-e00ea2f20283" path="/var/lib/kubelet/pods/62b69960-585e-4e89-a290-e00ea2f20283/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.394128 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd578b9-f296-486d-ba12-376e227c3b09" path="/var/lib/kubelet/pods/6bd578b9-f296-486d-ba12-376e227c3b09/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.394480 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81acf3d1-58a7-43ed-808d-411467094efe" path="/var/lib/kubelet/pods/81acf3d1-58a7-43ed-808d-411467094efe/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.395427 
4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15dc402-9291-4c21-aec2-11e96c353687" path="/var/lib/kubelet/pods/a15dc402-9291-4c21-aec2-11e96c353687/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.396284 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f33b9a-21cf-4ca9-82a1-5fb7592bb749" path="/var/lib/kubelet/pods/e1f33b9a-21cf-4ca9-82a1-5fb7592bb749/volumes" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.432652 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.432845 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/203abd37-654f-480c-8a9d-719d767aec4d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.433155 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts podName:e6be5c8e-c381-4e29-90e7-069d902c1805 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:31.933135499 +0000 UTC m=+1551.195312291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts") pod "root-account-create-update-zwt4c" (UID: "e6be5c8e-c381-4e29-90e7-069d902c1805") : configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.560125 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.562763 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.565969 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.567633 4893 scope.go:117] "RemoveContainer" containerID="26fa9847a83bdbc2805cf9345b971f151cfc501d5ecd4dfe251d3518ad82f625" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.569204 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.597689 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="galera" containerID="cri-o://320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519" gracePeriod=30 Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.616035 4893 scope.go:117] "RemoveContainer" containerID="1dde3e9c9cf1287de08c1c44f96f41fa14af9999481d035ed1d21d2907049654" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.637882 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-684d6b469b-c8l2b" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:37630->10.217.0.169:9311: read: connection reset by peer" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.637934 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-684d6b469b-c8l2b" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:37628->10.217.0.169:9311: read: connection reset by peer" Mar 14 07:24:31 crc 
kubenswrapper[4893]: I0314 07:24:31.666406 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.672317 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7776dc7c77-9lm7q"] Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.733297 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.740886 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.740983 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts\") pod \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741003 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741050 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92hw\" (UniqueName: \"kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw\") pod \"c9a21025-970d-4b50-8f80-b0926242b929\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741098 4893 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741117 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741139 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlt62\" (UniqueName: \"kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741180 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sgmd\" (UniqueName: \"kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd\") pod \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741212 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts\") pod \"c9a21025-970d-4b50-8f80-b0926242b929\" (UID: \"c9a21025-970d-4b50-8f80-b0926242b929\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741248 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzl2v\" (UniqueName: \"kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v\") pod 
\"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\" (UID: \"ab0242d4-11e1-4a4b-9e55-a841f2ba874d\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741306 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741345 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts\") pod \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\" (UID: \"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741366 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741390 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated\") pod \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\" (UID: \"4f6afb83-bfaa-41a6-8429-b8588d82c7a7\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741434 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab0242d4-11e1-4a4b-9e55-a841f2ba874d" (UID: "ab0242d4-11e1-4a4b-9e55-a841f2ba874d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741738 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.741919 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbkhk\" (UniqueName: \"kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.743009 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.743687 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.743932 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.743992 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:32.743969768 +0000 UTC m=+1552.006146560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.744170 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.744286 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.745293 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9a21025-970d-4b50-8f80-b0926242b929" (UID: "c9a21025-970d-4b50-8f80-b0926242b929"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.746651 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.746938 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.747069 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.747208 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-operator-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.747319 4893 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.747707 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22218c00-c79a-4d4a-a0e4-dc59f10ebaaf" (UID: "22218c00-c79a-4d4a-a0e4-dc59f10ebaaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.756096 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v" (OuterVolumeSpecName: "kube-api-access-zzl2v") pod "ab0242d4-11e1-4a4b-9e55-a841f2ba874d" (UID: "ab0242d4-11e1-4a4b-9e55-a841f2ba874d"). InnerVolumeSpecName "kube-api-access-zzl2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.765728 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw" (OuterVolumeSpecName: "kube-api-access-x92hw") pod "c9a21025-970d-4b50-8f80-b0926242b929" (UID: "c9a21025-970d-4b50-8f80-b0926242b929"). InnerVolumeSpecName "kube-api-access-x92hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.777650 4893 projected.go:194] Error preparing data for projected volume kube-api-access-jbkhk for pod openstack/keystone-1fc4-account-create-update-9ngml: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.777759 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:32.777736495 +0000 UTC m=+1552.039913287 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jbkhk" (UniqueName: "kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.778449 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62" (OuterVolumeSpecName: "kube-api-access-rlt62") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "kube-api-access-rlt62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.792803 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd" (OuterVolumeSpecName: "kube-api-access-5sgmd") pod "22218c00-c79a-4d4a-a0e4-dc59f10ebaaf" (UID: "22218c00-c79a-4d4a-a0e4-dc59f10ebaaf"). InnerVolumeSpecName "kube-api-access-5sgmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.795457 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.805108 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.818376 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.829790 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4f6afb83-bfaa-41a6-8429-b8588d82c7a7" (UID: "4f6afb83-bfaa-41a6-8429-b8588d82c7a7"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.848340 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849193 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849249 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849314 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcgpx\" (UniqueName: \"kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849382 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849439 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849469 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849495 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom\") pod \"ab16027b-4fcf-42bf-b586-a7b8ff348305\" (UID: \"ab16027b-4fcf-42bf-b586-a7b8ff348305\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849877 4893 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849894 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849903 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849912 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92hw\" (UniqueName: \"kubernetes.io/projected/c9a21025-970d-4b50-8f80-b0926242b929-kube-api-access-x92hw\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849931 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849940 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlt62\" (UniqueName: \"kubernetes.io/projected/4f6afb83-bfaa-41a6-8429-b8588d82c7a7-kube-api-access-rlt62\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849950 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sgmd\" (UniqueName: \"kubernetes.io/projected/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf-kube-api-access-5sgmd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849958 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a21025-970d-4b50-8f80-b0926242b929-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.849968 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzl2v\" (UniqueName: \"kubernetes.io/projected/ab0242d4-11e1-4a4b-9e55-a841f2ba874d-kube-api-access-zzl2v\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.862915 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.874802 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs" (OuterVolumeSpecName: "logs") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.891917 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts" (OuterVolumeSpecName: "scripts") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.895791 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx" (OuterVolumeSpecName: "kube-api-access-tcgpx") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "kube-api-access-tcgpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.897544 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.913691 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951416 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951454 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5gkx\" (UniqueName: \"kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951502 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951543 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951656 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951685 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951744 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.951768 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs\") pod \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\" (UID: \"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a\") " Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952110 4893 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab16027b-4fcf-42bf-b586-a7b8ff348305-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952122 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcgpx\" (UniqueName: 
\"kubernetes.io/projected/ab16027b-4fcf-42bf-b586-a7b8ff348305-kube-api-access-tcgpx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952131 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952139 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952146 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab16027b-4fcf-42bf-b586-a7b8ff348305-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.952154 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.952212 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: E0314 07:24:31.953093 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts podName:e6be5c8e-c381-4e29-90e7-069d902c1805 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:32.953075338 +0000 UTC m=+1552.215252130 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts") pod "root-account-create-update-zwt4c" (UID: "e6be5c8e-c381-4e29-90e7-069d902c1805") : configmap "openstack-scripts" not found Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.954677 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.965692 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs" (OuterVolumeSpecName: "logs") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.977956 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts" (OuterVolumeSpecName: "scripts") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.990146 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx" (OuterVolumeSpecName: "kube-api-access-w5gkx") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "kube-api-access-w5gkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:31 crc kubenswrapper[4893]: I0314 07:24:31.992445 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.053904 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.053934 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.053945 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5gkx\" (UniqueName: \"kubernetes.io/projected/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-kube-api-access-w5gkx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.053955 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.053965 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.075390 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs" (OuterVolumeSpecName: 
"public-tls-certs") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.105545 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.142429 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.145420 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.155942 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.156019 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.156034 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.156073 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.168025 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.175659 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data" (OuterVolumeSpecName: "config-data") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.194628 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" (UID: "dc1eed68-2de8-46ed-91c9-3eb4fe897d3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.216988 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data" (OuterVolumeSpecName: "config-data") pod "ab16027b-4fcf-42bf-b586-a7b8ff348305" (UID: "ab16027b-4fcf-42bf-b586-a7b8ff348305"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.258436 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.258502 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.258514 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.258537 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab16027b-4fcf-42bf-b586-a7b8ff348305-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 
07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.340230 4893 generic.go:334] "Generic (PLEG): container finished" podID="1239ac87-7084-45c6-9eef-ecab07108656" containerID="200f0500dd95736bf725b120a1467c014a0ae3d6596c940427a6b0fa69a9ddd9" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.340308 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerDied","Data":"200f0500dd95736bf725b120a1467c014a0ae3d6596c940427a6b0fa69a9ddd9"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.345740 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc7c1963-417f-453f-8983-1c03d349f76d" containerID="55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.345818 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerDied","Data":"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.348057 4893 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-zwt4c" secret="" err="secret \"galera-openstack-dockercfg-f6n62\" not found" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.348118 4893 scope.go:117] "RemoveContainer" containerID="3e4b4f462f2eb63d686cf632d5ec4e1a3100f1a2224fff5ea78dfd30b8750794" Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.349044 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-zwt4c_openstack(e6be5c8e-c381-4e29-90e7-069d902c1805)\"" pod="openstack/root-account-create-update-zwt4c" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.355021 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.355019 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab16027b-4fcf-42bf-b586-a7b8ff348305","Type":"ContainerDied","Data":"2b3f223f3d3be067b7af41b7ac07c3c891459c19442c72cfd982aa0ff3500e8a"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.355181 4893 scope.go:117] "RemoveContainer" containerID="471e6c6aa09a8b1a609a1618d206f8f19bac722872a90829f538c1ebf60510ec" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.360595 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4589578b-zwqpr" event={"ID":"2195ecfb-6eeb-48f1-8b55-c57520974663","Type":"ContainerDied","Data":"38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.360754 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38eeda47e7a5eea0b6dfddbcfda7c7aa248c1d05790dee7938fc29ba025a25f6" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 
07:24:32.379279 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07f9759-5cdb-4e42-b7c6-714d0e34ee55","Type":"ContainerDied","Data":"a4b5d6451188781e94a46b2be23e8d8f4e42365d0974c6fec260c7d6efb45da6"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.379601 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b5d6451188781e94a46b2be23e8d8f4e42365d0974c6fec260c7d6efb45da6" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.390823 4893 generic.go:334] "Generic (PLEG): container finished" podID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerID="3ed3ad5f48a78c2ee75e5dd1a6438ae0b7531b0fefc6d91c0f46aad1b39d3ee9" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.390915 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerDied","Data":"3ed3ad5f48a78c2ee75e5dd1a6438ae0b7531b0fefc6d91c0f46aad1b39d3ee9"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.404544 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78ff-account-create-update-rbfhn" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.404683 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78ff-account-create-update-rbfhn" event={"ID":"22218c00-c79a-4d4a-a0e4-dc59f10ebaaf","Type":"ContainerDied","Data":"ad03753bc9e5096402c46b4a68aac223028a3c27d17a6333c612615b285cf61f"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.414821 4893 generic.go:334] "Generic (PLEG): container finished" podID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" containerID="7ab1a7eb14a49fb840db8cffee0e903b4ae4cb885907191a6bc0ab1496f7f86c" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.414873 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f","Type":"ContainerDied","Data":"7ab1a7eb14a49fb840db8cffee0e903b4ae4cb885907191a6bc0ab1496f7f86c"} Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.417571 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b is running failed: container process not found" containerID="c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.418666 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b is running failed: container process not found" containerID="c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.419466 4893 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b is running failed: container process not found" containerID="c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.419528 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerName="nova-cell1-conductor-conductor" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.426859 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc1eed68-2de8-46ed-91c9-3eb4fe897d3a","Type":"ContainerDied","Data":"3f9e7a2379346ac60b5e650f8b10b47f08982c0ae473e7d2b73e4a1058b47d2e"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.427068 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.434913 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1cad381a-c3bf-4fc8-a314-6f45028f3482","Type":"ContainerDied","Data":"4b6ebdcad9604ec201a9052264933fd32ee7e23b9e9bcc41a262190418c54706"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.434946 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b6ebdcad9604ec201a9052264933fd32ee7e23b9e9bcc41a262190418c54706" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.437503 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.439234 4893 generic.go:334] "Generic (PLEG): container finished" podID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerID="a0362c79961c85ff09cc537289495d9c32c6edf16fe15acfc1b254f825509254" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.439315 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerDied","Data":"a0362c79961c85ff09cc537289495d9c32c6edf16fe15acfc1b254f825509254"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.444604 4893 generic.go:334] "Generic (PLEG): container finished" podID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerID="c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" exitCode=0 Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.444648 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0f57646-651c-4b8f-b73d-6606d06fa3a3","Type":"ContainerDied","Data":"c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.446599 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.446726 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f6afb83-bfaa-41a6-8429-b8588d82c7a7","Type":"ContainerDied","Data":"9c18ae52999bed19e6062471aa41398ff6a7cc15c07667a19f1cbd9e1d26ed4a"} Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.446790 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.449591 4893 scope.go:117] "RemoveContainer" containerID="492cd7441a800dc10a30bd9b6a28dd3d89e262a94b290571698a0218593d8d7f" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.451478 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b5a-account-create-update-hkpb6" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.453574 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.453680 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33a2-account-create-update-fzsjz" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.469220 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.482008 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.487350 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.499388 4893 scope.go:117] "RemoveContainer" containerID="4d1c426d2b310ef6f81cd1f91ebfb6670cc4b8ebb961307ca3eb73f633e88e39" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.509456 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.524651 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.539857 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78ff-account-create-update-rbfhn"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.545204 4893 scope.go:117] "RemoveContainer" containerID="43f69db823a3f9f8c85a6a93d52c4972fbfa042fe09bc248a37ffd104f7a1ea4" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566396 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566434 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config\") pod \"1cad381a-c3bf-4fc8-a314-6f45028f3482\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566476 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566492 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 
07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566510 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566576 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566604 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566659 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrhtt\" (UniqueName: \"kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566701 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566752 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs\") pod 
\"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566778 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566795 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566813 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566829 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs\") pod \"1cad381a-c3bf-4fc8-a314-6f45028f3482\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566863 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdws\" (UniqueName: \"kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566884 4893 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run\") pod \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\" (UID: \"a07f9759-5cdb-4e42-b7c6-714d0e34ee55\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566899 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data\") pod \"2195ecfb-6eeb-48f1-8b55-c57520974663\" (UID: \"2195ecfb-6eeb-48f1-8b55-c57520974663\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566924 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrkvb\" (UniqueName: \"kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb\") pod \"1cad381a-c3bf-4fc8-a314-6f45028f3482\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.566948 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle\") pod \"1cad381a-c3bf-4fc8-a314-6f45028f3482\" (UID: \"1cad381a-c3bf-4fc8-a314-6f45028f3482\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.568432 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.576501 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs" (OuterVolumeSpecName: "logs") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.576612 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.577792 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs" (OuterVolumeSpecName: "logs") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.587369 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.593495 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.601666 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb" (OuterVolumeSpecName: "kube-api-access-wrkvb") pod "1cad381a-c3bf-4fc8-a314-6f45028f3482" (UID: "1cad381a-c3bf-4fc8-a314-6f45028f3482"). InnerVolumeSpecName "kube-api-access-wrkvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.606023 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.606917 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts" (OuterVolumeSpecName: "scripts") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.606931 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt" (OuterVolumeSpecName: "kube-api-access-xrhtt") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "kube-api-access-xrhtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.607000 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts" (OuterVolumeSpecName: "scripts") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.612341 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.619143 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.633831 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws" (OuterVolumeSpecName: "kube-api-access-4tdws") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "kube-api-access-4tdws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.645056 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.675091 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.675922 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-33a2-account-create-update-fzsjz"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.676408 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4lnp\" (UniqueName: \"kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp\") pod \"b6ca04fa-accd-437a-ab63-d39d14a49777\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.676452 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle\") pod \"b6ca04fa-accd-437a-ab63-d39d14a49777\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.676472 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs\") pod \"b6ca04fa-accd-437a-ab63-d39d14a49777\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.676671 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data\") pod \"b6ca04fa-accd-437a-ab63-d39d14a49777\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.676823 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs\") pod \"b6ca04fa-accd-437a-ab63-d39d14a49777\" (UID: \"b6ca04fa-accd-437a-ab63-d39d14a49777\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677168 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677187 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdws\" (UniqueName: \"kubernetes.io/projected/2195ecfb-6eeb-48f1-8b55-c57520974663-kube-api-access-4tdws\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677198 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-httpd-run\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677206 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrkvb\" (UniqueName: \"kubernetes.io/projected/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-api-access-wrkvb\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677225 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677234 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677243 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrhtt\" (UniqueName: \"kubernetes.io/projected/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-kube-api-access-xrhtt\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677251 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677259 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2195ecfb-6eeb-48f1-8b55-c57520974663-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.677232 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs" (OuterVolumeSpecName: "logs") pod "b6ca04fa-accd-437a-ab63-d39d14a49777" (UID: "b6ca04fa-accd-437a-ab63-d39d14a49777"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.694942 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.697169 4893 scope.go:117] "RemoveContainer" containerID="027f613615f6fb782a95ff00ffffb3250ba4bf7974f3a506df7ad3cbe9d15d84" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.709805 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3b5a-account-create-update-hkpb6"] Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.716842 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "1cad381a-c3bf-4fc8-a314-6f45028f3482" (UID: "1cad381a-c3bf-4fc8-a314-6f45028f3482"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.736776 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp" (OuterVolumeSpecName: "kube-api-access-m4lnp") pod "b6ca04fa-accd-437a-ab63-d39d14a49777" (UID: "b6ca04fa-accd-437a-ab63-d39d14a49777"). InnerVolumeSpecName "kube-api-access-m4lnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.762779 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cad381a-c3bf-4fc8-a314-6f45028f3482" (UID: "1cad381a-c3bf-4fc8-a314-6f45028f3482"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.768320 4893 scope.go:117] "RemoveContainer" containerID="5d483ecdea3679e076c45905acc44deb617277c515c2b95cca2cf608060961b7" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.780230 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.780403 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.780579 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgwxv\" (UniqueName: \"kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.780717 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.789676 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: 
\"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.789775 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.789842 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs\") pod \"fefe82b2-447a-4f97-8221-7050b61ef60c\" (UID: \"fefe82b2-447a-4f97-8221-7050b61ef60c\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790157 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbkhk\" (UniqueName: \"kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790331 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts\") pod \"keystone-1fc4-account-create-update-9ngml\" (UID: \"0a23c46a-df73-4630-bc6b-8df7a4bff754\") " pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790496 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790508 4893 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-m4lnp\" (UniqueName: \"kubernetes.io/projected/b6ca04fa-accd-437a-ab63-d39d14a49777-kube-api-access-m4lnp\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790532 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ca04fa-accd-437a-ab63-d39d14a49777-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.790543 4893 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.790608 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.790660 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:34.790642559 +0000 UTC m=+1554.052819351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : configmap "openstack-scripts" not found Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.796758 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs" (OuterVolumeSpecName: "logs") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.799545 4893 projected.go:194] Error preparing data for projected volume kube-api-access-jbkhk for pod openstack/keystone-1fc4-account-create-update-9ngml: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.799605 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk podName:0a23c46a-df73-4630-bc6b-8df7a4bff754 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:34.799589035 +0000 UTC m=+1554.061765827 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jbkhk" (UniqueName: "kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk") pod "keystone-1fc4-account-create-update-9ngml" (UID: "0a23c46a-df73-4630-bc6b-8df7a4bff754") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.815041 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.827233 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv" (OuterVolumeSpecName: "kube-api-access-tgwxv") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "kube-api-access-tgwxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.828143 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.861978 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.893751 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fefe82b2-447a-4f97-8221-7050b61ef60c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.893801 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.893812 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgwxv\" (UniqueName: \"kubernetes.io/projected/fefe82b2-447a-4f97-8221-7050b61ef60c-kube-api-access-tgwxv\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.893847 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.915733 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.946297 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995302 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995396 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle\") pod \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995465 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpglh\" (UniqueName: \"kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh\") pod \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995594 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5dvb\" (UniqueName: \"kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995652 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995679 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995726 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995791 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data\") pod \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\" (UID: \"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.995863 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle\") pod \"1239ac87-7084-45c6-9eef-ecab07108656\" (UID: \"1239ac87-7084-45c6-9eef-ecab07108656\") " Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.996212 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.996441 4893 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1239ac87-7084-45c6-9eef-ecab07108656-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: I0314 07:24:32.996460 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.996543 4893 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:32 crc kubenswrapper[4893]: E0314 07:24:32.996605 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts podName:e6be5c8e-c381-4e29-90e7-069d902c1805 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:34.996580811 +0000 UTC m=+1554.258757613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts") pod "root-account-create-update-zwt4c" (UID: "e6be5c8e-c381-4e29-90e7-069d902c1805") : configmap "openstack-scripts" not found Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.018979 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.019039 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts" (OuterVolumeSpecName: "scripts") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.032477 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh" (OuterVolumeSpecName: "kube-api-access-bpglh") pod "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" (UID: "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f"). InnerVolumeSpecName "kube-api-access-bpglh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.040165 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb" (OuterVolumeSpecName: "kube-api-access-z5dvb") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "kube-api-access-z5dvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.059026 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.074475 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data" (OuterVolumeSpecName: "config-data") pod "b6ca04fa-accd-437a-ab63-d39d14a49777" (UID: "b6ca04fa-accd-437a-ab63-d39d14a49777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.102582 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.102607 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.103047 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpglh\" (UniqueName: \"kubernetes.io/projected/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-kube-api-access-bpglh\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.103061 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.103070 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5dvb\" (UniqueName: \"kubernetes.io/projected/1239ac87-7084-45c6-9eef-ecab07108656-kube-api-access-z5dvb\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.103078 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.104203 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data" (OuterVolumeSpecName: "config-data") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.105633 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.109708 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.109810 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.132707 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.134426 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "1cad381a-c3bf-4fc8-a314-6f45028f3482" (UID: "1cad381a-c3bf-4fc8-a314-6f45028f3482"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.136024 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.155328 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b6ca04fa-accd-437a-ab63-d39d14a49777" (UID: "b6ca04fa-accd-437a-ab63-d39d14a49777"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.156264 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.170223 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data" (OuterVolumeSpecName: "config-data") pod "a07f9759-5cdb-4e42-b7c6-714d0e34ee55" (UID: "a07f9759-5cdb-4e42-b7c6-714d0e34ee55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.187610 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data" (OuterVolumeSpecName: "config-data") pod "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" (UID: "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204806 4893 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cad381a-c3bf-4fc8-a314-6f45028f3482-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204846 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204866 4893 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204878 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204888 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204898 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204909 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204919 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9759-5cdb-4e42-b7c6-714d0e34ee55-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.204929 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.205982 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6ca04fa-accd-437a-ab63-d39d14a49777" (UID: "b6ca04fa-accd-437a-ab63-d39d14a49777"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.217583 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" (UID: "eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.228804 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data" (OuterVolumeSpecName: "config-data") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.232705 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.248000 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fefe82b2-447a-4f97-8221-7050b61ef60c" (UID: "fefe82b2-447a-4f97-8221-7050b61ef60c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.271897 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2195ecfb-6eeb-48f1-8b55-c57520974663" (UID: "2195ecfb-6eeb-48f1-8b55-c57520974663"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.291373 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data" (OuterVolumeSpecName: "config-data") pod "1239ac87-7084-45c6-9eef-ecab07108656" (UID: "1239ac87-7084-45c6-9eef-ecab07108656"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306773 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306801 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306811 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fefe82b2-447a-4f97-8221-7050b61ef60c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306821 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1239ac87-7084-45c6-9eef-ecab07108656-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 
07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306831 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ca04fa-accd-437a-ab63-d39d14a49777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306839 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.306848 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2195ecfb-6eeb-48f1-8b55-c57520974663-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.347657 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.358684 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.400003 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203abd37-654f-480c-8a9d-719d767aec4d" path="/var/lib/kubelet/pods/203abd37-654f-480c-8a9d-719d767aec4d/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.400588 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22218c00-c79a-4d4a-a0e4-dc59f10ebaaf" path="/var/lib/kubelet/pods/22218c00-c79a-4d4a-a0e4-dc59f10ebaaf/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.401136 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" path="/var/lib/kubelet/pods/4f6afb83-bfaa-41a6-8429-b8588d82c7a7/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.402143 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0242d4-11e1-4a4b-9e55-a841f2ba874d" path="/var/lib/kubelet/pods/ab0242d4-11e1-4a4b-9e55-a841f2ba874d/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.402494 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" path="/var/lib/kubelet/pods/ab16027b-4fcf-42bf-b586-a7b8ff348305/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.403414 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a21025-970d-4b50-8f80-b0926242b929" path="/var/lib/kubelet/pods/c9a21025-970d-4b50-8f80-b0926242b929/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.404049 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" path="/var/lib/kubelet/pods/dc1eed68-2de8-46ed-91c9-3eb4fe897d3a/volumes" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.407936 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5xhs\" (UniqueName: 
\"kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs\") pod \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.416816 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs" (OuterVolumeSpecName: "kube-api-access-l5xhs") pod "d0f57646-651c-4b8f-b73d-6606d06fa3a3" (UID: "d0f57646-651c-4b8f-b73d-6606d06fa3a3"). InnerVolumeSpecName "kube-api-access-l5xhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.420308 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data\") pod \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.420480 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle\") pod \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\" (UID: \"d0f57646-651c-4b8f-b73d-6606d06fa3a3\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.422618 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5xhs\" (UniqueName: \"kubernetes.io/projected/d0f57646-651c-4b8f-b73d-6606d06fa3a3-kube-api-access-l5xhs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.446356 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data" (OuterVolumeSpecName: "config-data") pod "d0f57646-651c-4b8f-b73d-6606d06fa3a3" (UID: "d0f57646-651c-4b8f-b73d-6606d06fa3a3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.459670 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f57646-651c-4b8f-b73d-6606d06fa3a3" (UID: "d0f57646-651c-4b8f-b73d-6606d06fa3a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.467622 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6ca04fa-accd-437a-ab63-d39d14a49777","Type":"ContainerDied","Data":"c757d516e66354fa6a18b9042b8d29238cef02b11c13610c41f431086a774b2e"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.467669 4893 scope.go:117] "RemoveContainer" containerID="3ed3ad5f48a78c2ee75e5dd1a6438ae0b7531b0fefc6d91c0f46aad1b39d3ee9" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.467743 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.471002 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f","Type":"ContainerDied","Data":"718f695f76bab1d83372fb0a3cbf3c3aaf7b424b06a7cb6f76e286df01b44511"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.471072 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.486584 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1239ac87-7084-45c6-9eef-ecab07108656","Type":"ContainerDied","Data":"3b1b5b753d3d3c389a58958b6db29604cea9261ff16da90a1c0179249689611d"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.486716 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.498945 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-684d6b469b-c8l2b" event={"ID":"fefe82b2-447a-4f97-8221-7050b61ef60c","Type":"ContainerDied","Data":"107594f7db66cb8800e239ba66cb0bb80a38fe43aa097c5639b7bb2ebaf04975"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.499547 4893 scope.go:117] "RemoveContainer" containerID="928a88e96ca63f18e3047a55a85109e7284281e3348f6b7748c2c86111c4f15e" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.499704 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-684d6b469b-c8l2b" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.504192 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0f57646-651c-4b8f-b73d-6606d06fa3a3","Type":"ContainerDied","Data":"280b9a33b356361ff4e472b1ae89ba0321e49995373f4302c52cc974cfcc8c57"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.504246 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.508255 4893 generic.go:334] "Generic (PLEG): container finished" podID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerID="8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb" exitCode=0 Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.508338 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.508388 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerDied","Data":"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.508493 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0d5e368-4fac-48b9-a64c-717f3acf9388","Type":"ContainerDied","Data":"0e443246654e1bf453e67cfc9d4f720cd7e137ce2408b6c5f4c882b5ee0c4b30"} Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.513410 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.513470 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.513646 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1fc4-account-create-update-9ngml" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.513685 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4589578b-zwqpr" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523244 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523370 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4prf8\" (UniqueName: \"kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523423 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523440 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523466 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.523537 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs\") pod \"a0d5e368-4fac-48b9-a64c-717f3acf9388\" (UID: \"a0d5e368-4fac-48b9-a64c-717f3acf9388\") " Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.525761 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs" (OuterVolumeSpecName: "logs") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.533092 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.533117 4893 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d5e368-4fac-48b9-a64c-717f3acf9388-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.533127 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f57646-651c-4b8f-b73d-6606d06fa3a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.555088 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8" (OuterVolumeSpecName: "kube-api-access-4prf8") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "kube-api-access-4prf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.559078 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.571431 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data" (OuterVolumeSpecName: "config-data") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.579227 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.596758 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0d5e368-4fac-48b9-a64c-717f3acf9388" (UID: "a0d5e368-4fac-48b9-a64c-717f3acf9388"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.634402 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.634745 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.634756 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.634765 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d5e368-4fac-48b9-a64c-717f3acf9388-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.634775 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4prf8\" (UniqueName: \"kubernetes.io/projected/a0d5e368-4fac-48b9-a64c-717f3acf9388-kube-api-access-4prf8\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.677437 4893 scope.go:117] "RemoveContainer" containerID="7ab1a7eb14a49fb840db8cffee0e903b4ae4cb885907191a6bc0ab1496f7f86c" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.683676 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.694685 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d4589578b-zwqpr"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.700809 4893 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.721592 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.763236 4893 scope.go:117] "RemoveContainer" containerID="6d4a0c5d5f1c982cd165c28bd58f8c768f1c7c5b718e400d69233659dc76ae51" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.792991 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.809811 4893 scope.go:117] "RemoveContainer" containerID="200f0500dd95736bf725b120a1467c014a0ae3d6596c940427a6b0fa69a9ddd9" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.824643 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-684d6b469b-c8l2b"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.843567 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.857784 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.869535 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.875283 4893 scope.go:117] "RemoveContainer" containerID="a0362c79961c85ff09cc537289495d9c32c6edf16fe15acfc1b254f825509254" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.888543 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.900933 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1fc4-account-create-update-9ngml"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.908300 4893 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/keystone-1fc4-account-create-update-9ngml"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.916417 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.924664 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.930577 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.936942 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.942137 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.946089 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.948136 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:24:33 crc kubenswrapper[4893]: E0314 07:24:33.949711 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:33 crc kubenswrapper[4893]: E0314 07:24:33.949761 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data podName:7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e nodeName:}" failed. No retries permitted until 2026-03-14 07:24:41.94974792 +0000 UTC m=+1561.211924712 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data") pod "rabbitmq-cell1-server-0" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.967818 4893 scope.go:117] "RemoveContainer" containerID="142a5e92673e832e9dc3ab1aaae6b53df019680363c32f4373ec08b60c93fa74" Mar 14 07:24:33 crc kubenswrapper[4893]: I0314 07:24:33.967974 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.028881 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.059802 4893 scope.go:117] "RemoveContainer" containerID="c9b16cf0fb0ddb6f10075d1c80980022cc10c492e67b85046aa17943882b5a0b" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.060587 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrqb5\" (UniqueName: \"kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5\") pod \"e6be5c8e-c381-4e29-90e7-069d902c1805\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.060682 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts\") pod \"e6be5c8e-c381-4e29-90e7-069d902c1805\" (UID: \"e6be5c8e-c381-4e29-90e7-069d902c1805\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.061626 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbkhk\" (UniqueName: \"kubernetes.io/projected/0a23c46a-df73-4630-bc6b-8df7a4bff754-kube-api-access-jbkhk\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.061647 4893 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a23c46a-df73-4630-bc6b-8df7a4bff754-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.068225 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5" (OuterVolumeSpecName: "kube-api-access-rrqb5") pod "e6be5c8e-c381-4e29-90e7-069d902c1805" (UID: "e6be5c8e-c381-4e29-90e7-069d902c1805"). InnerVolumeSpecName "kube-api-access-rrqb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.069935 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6be5c8e-c381-4e29-90e7-069d902c1805" (UID: "e6be5c8e-c381-4e29-90e7-069d902c1805"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.150751 4893 scope.go:117] "RemoveContainer" containerID="8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.162880 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrqb5\" (UniqueName: \"kubernetes.io/projected/e6be5c8e-c381-4e29-90e7-069d902c1805-kube-api-access-rrqb5\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.162910 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6be5c8e-c381-4e29-90e7-069d902c1805-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.183941 4893 scope.go:117] "RemoveContainer" containerID="d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.190368 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qdfzm" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="registry-server" probeResult="failure" output=< Mar 14 07:24:34 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:24:34 crc kubenswrapper[4893]: > Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.234700 4893 scope.go:117] "RemoveContainer" containerID="8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.235392 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb\": container with ID starting with 8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb not found: ID does not exist" 
containerID="8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.235419 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb"} err="failed to get container status \"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb\": rpc error: code = NotFound desc = could not find container \"8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb\": container with ID starting with 8b903cc17bbe84a61bbbbd1a8b800ac009c7878c7b0d328302abe5a9b20394cb not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.235452 4893 scope.go:117] "RemoveContainer" containerID="d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.235812 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80\": container with ID starting with d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80 not found: ID does not exist" containerID="d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.235829 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80"} err="failed to get container status \"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80\": rpc error: code = NotFound desc = could not find container \"d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80\": container with ID starting with d1f0172858057fee57fcad3f87f58b29148aa386e794d735c5c5da631d14ed80 not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.321843 4893 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.442875 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7a6e4cc-c5a8-4551-bd39-ab4eb11331df/ovn-northd/0.log" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.442983 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.473812 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.473866 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.473933 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474499 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474815 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474856 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474922 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474960 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2vqj\" (UniqueName: \"kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.474978 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts\") pod \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\" (UID: \"444c0d3d-4ad4-47a3-9281-b7028d69a78a\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.475345 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.476295 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.476314 4893 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.476355 4893 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data podName:a752b3c8-284e-490f-be39-506e7a075c6f nodeName:}" failed. No retries permitted until 2026-03-14 07:24:42.476340499 +0000 UTC m=+1561.738517291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data") pod "rabbitmq-server-0" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f") : configmap "rabbitmq-config-data" not found Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.477042 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.477279 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.485007 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj" (OuterVolumeSpecName: "kube-api-access-c2vqj") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "kube-api-access-c2vqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.493013 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.510787 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.519254 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "444c0d3d-4ad4-47a3-9281-b7028d69a78a" (UID: "444c0d3d-4ad4-47a3-9281-b7028d69a78a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527514 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7a6e4cc-c5a8-4551-bd39-ab4eb11331df/ovn-northd/0.log" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527599 4893 generic.go:334] "Generic (PLEG): container finished" podID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerID="7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec" exitCode=139 Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527670 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527706 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerDied","Data":"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec"} Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527757 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df","Type":"ContainerDied","Data":"7b7e7564bf784fd604eab18c4e27a75f662ed319eef40cfe755745c6359a0f95"} Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.527780 4893 scope.go:117] "RemoveContainer" containerID="a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.536090 4893 generic.go:334] "Generic (PLEG): container finished" podID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerID="320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519" exitCode=0 Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.536160 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerDied","Data":"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519"} Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.536199 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"444c0d3d-4ad4-47a3-9281-b7028d69a78a","Type":"ContainerDied","Data":"7e7da4e22bf60aa0ec9b55713ac88c949061bd33595456b4882b823f5ccc3da7"} Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.536173 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.537857 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zwt4c" event={"ID":"e6be5c8e-c381-4e29-90e7-069d902c1805","Type":"ContainerDied","Data":"dff232fbc3c9a3abb4d7e60a566fdb00bf629c3d040d8e58813243ceb43d28ae"} Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.537886 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zwt4c" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.552304 4893 scope.go:117] "RemoveContainer" containerID="7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.569422 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577011 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rdqs\" (UniqueName: \"kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577082 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577199 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: 
\"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577236 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577257 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577302 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.577337 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts\") pod \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\" (UID: \"c7a6e4cc-c5a8-4551-bd39-ab4eb11331df\") " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578336 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578362 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2vqj\" (UniqueName: \"kubernetes.io/projected/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kube-api-access-c2vqj\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc 
kubenswrapper[4893]: I0314 07:24:34.578373 4893 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578382 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578390 4893 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/444c0d3d-4ad4-47a3-9281-b7028d69a78a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578398 4893 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/444c0d3d-4ad4-47a3-9281-b7028d69a78a-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.578408 4893 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/444c0d3d-4ad4-47a3-9281-b7028d69a78a-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.579082 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config" (OuterVolumeSpecName: "config") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.579269 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.583844 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts" (OuterVolumeSpecName: "scripts") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.583908 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zwt4c"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.590745 4893 scope.go:117] "RemoveContainer" containerID="a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.591355 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1\": container with ID starting with a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1 not found: ID does not exist" containerID="a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.591392 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1"} err="failed to get container status 
\"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1\": rpc error: code = NotFound desc = could not find container \"a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1\": container with ID starting with a353de5b0e8e4b5839fc133d37ff96079b4483877961980a7a0ff324ee5c28c1 not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.591416 4893 scope.go:117] "RemoveContainer" containerID="7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.591797 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec\": container with ID starting with 7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec not found: ID does not exist" containerID="7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.591823 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec"} err="failed to get container status \"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec\": rpc error: code = NotFound desc = could not find container \"7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec\": container with ID starting with 7eb5ea7adcd2372f21f4aaa33bfc67aa71545d592bca12e3e249bea3ada6eeec not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.591839 4893 scope.go:117] "RemoveContainer" containerID="320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.592939 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.601323 4893 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs" (OuterVolumeSpecName: "kube-api-access-6rdqs") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "kube-api-access-6rdqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.607107 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.608347 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.619316 4893 scope.go:117] "RemoveContainer" containerID="ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.624719 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.661818 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.670117 4893 scope.go:117] "RemoveContainer" containerID="320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.678276 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519\": container with ID starting with 320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519 not found: ID does not exist" containerID="320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.678344 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519"} err="failed to get container status \"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519\": rpc error: code = NotFound desc = could not find container \"320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519\": container with ID starting with 320bfcf2a61e367399a0d0afd62dedcfc1630863667c2d338451fad1fcb6b519 not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.678371 4893 scope.go:117] "RemoveContainer" containerID="ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee" Mar 14 07:24:34 crc kubenswrapper[4893]: E0314 07:24:34.680039 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee\": container with ID starting with ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee not found: ID does not exist" containerID="ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680112 
4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee"} err="failed to get container status \"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee\": rpc error: code = NotFound desc = could not find container \"ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee\": container with ID starting with ced6dde900ea0387dd4c09e308e98dd1da790cf886736ef5a116af997fc079ee not found: ID does not exist" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680162 4893 scope.go:117] "RemoveContainer" containerID="3e4b4f462f2eb63d686cf632d5ec4e1a3100f1a2224fff5ea78dfd30b8750794" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680632 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rdqs\" (UniqueName: \"kubernetes.io/projected/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-kube-api-access-6rdqs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680652 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680762 4893 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680772 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680783 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680792 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.680800 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.687867 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" (UID: "c7a6e4cc-c5a8-4551-bd39-ab4eb11331df"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.782751 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.875501 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:24:34 crc kubenswrapper[4893]: I0314 07:24:34.881765 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.023449 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190197 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190262 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190284 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190326 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190411 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190452 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190477 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.190566 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph9ps\" (UniqueName: \"kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps\") pod \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\" (UID: \"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a\") " Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.210008 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts" (OuterVolumeSpecName: "scripts") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.217198 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.222411 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps" (OuterVolumeSpecName: "kube-api-access-ph9ps") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "kube-api-access-ph9ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.242617 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.244725 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.260761 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data" (OuterVolumeSpecName: "config-data") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297182 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297226 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297240 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph9ps\" (UniqueName: \"kubernetes.io/projected/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-kube-api-access-ph9ps\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297253 4893 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297266 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.297279 4893 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.303253 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.321671 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" (UID: "0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.385389 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a23c46a-df73-4630-bc6b-8df7a4bff754" path="/var/lib/kubelet/pods/0a23c46a-df73-4630-bc6b-8df7a4bff754/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.385704 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1239ac87-7084-45c6-9eef-ecab07108656" path="/var/lib/kubelet/pods/1239ac87-7084-45c6-9eef-ecab07108656/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.386358 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cad381a-c3bf-4fc8-a314-6f45028f3482" path="/var/lib/kubelet/pods/1cad381a-c3bf-4fc8-a314-6f45028f3482/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.386891 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" path="/var/lib/kubelet/pods/2195ecfb-6eeb-48f1-8b55-c57520974663/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.388056 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" path="/var/lib/kubelet/pods/444c0d3d-4ad4-47a3-9281-b7028d69a78a/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.388665 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" 
path="/var/lib/kubelet/pods/a07f9759-5cdb-4e42-b7c6-714d0e34ee55/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.390096 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" path="/var/lib/kubelet/pods/a0d5e368-4fac-48b9-a64c-717f3acf9388/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.390996 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" path="/var/lib/kubelet/pods/b6ca04fa-accd-437a-ab63-d39d14a49777/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.391577 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" path="/var/lib/kubelet/pods/c7a6e4cc-c5a8-4551-bd39-ab4eb11331df/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.392487 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" path="/var/lib/kubelet/pods/d0f57646-651c-4b8f-b73d-6606d06fa3a3/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.392930 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" path="/var/lib/kubelet/pods/e6be5c8e-c381-4e29-90e7-069d902c1805/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.393352 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" path="/var/lib/kubelet/pods/eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.394240 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" path="/var/lib/kubelet/pods/fefe82b2-447a-4f97-8221-7050b61ef60c/volumes" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.399136 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.399177 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.556558 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.556621 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.559095 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.560655 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" 
containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.564763 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.564819 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.565105 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.565131 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.572103 4893 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:24:35 crc kubenswrapper[4893]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 14 07:24:35 crc kubenswrapper[4893]: > Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.578487 4893 generic.go:334] "Generic (PLEG): container finished" podID="a752b3c8-284e-490f-be39-506e7a075c6f" containerID="ee01de4008b1335a2c5408eb31930a4fb5131e254bc97793e1f6069614788d91" exitCode=0 Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.578629 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerDied","Data":"ee01de4008b1335a2c5408eb31930a4fb5131e254bc97793e1f6069614788d91"} Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.594410 4893 generic.go:334] "Generic (PLEG): container finished" podID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerID="a5f24be09a6c54f86e91629c32362dd6fc01dad63e4f1a792a6a2d72329e86ae" exitCode=0 Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.594539 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerDied","Data":"a5f24be09a6c54f86e91629c32362dd6fc01dad63e4f1a792a6a2d72329e86ae"} Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.596152 4893 generic.go:334] "Generic (PLEG): container finished" podID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" containerID="c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f" exitCode=0 Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.596198 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787475576f-6cj4v" 
event={"ID":"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a","Type":"ContainerDied","Data":"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f"} Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.596224 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787475576f-6cj4v" event={"ID":"0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a","Type":"ContainerDied","Data":"396df3dc672d4824a6f00250b633c94cf41afd05817c9b12e2f6bfc896aa4a85"} Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.596243 4893 scope.go:117] "RemoveContainer" containerID="c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.596335 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-787475576f-6cj4v" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.613711 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.619395 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-787475576f-6cj4v"] Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.624997 4893 scope.go:117] "RemoveContainer" containerID="c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f" Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.625404 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f\": container with ID starting with c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f not found: ID does not exist" containerID="c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.625438 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f"} err="failed to get container status \"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f\": rpc error: code = NotFound desc = could not find container \"c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f\": container with ID starting with c53128a0bfb2847ab538756827eaf9e85484436e9a236685873bbc44f3f46b7f not found: ID does not exist" Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.649655 4893 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 14 07:24:35 crc kubenswrapper[4893]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-14T07:24:28Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:24:35 crc kubenswrapper[4893]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Mar 14 07:24:35 crc kubenswrapper[4893]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-8rcbf" message=< Mar 14 07:24:35 crc kubenswrapper[4893]: Exiting ovn-controller (1) [FAILED] Mar 14 07:24:35 crc kubenswrapper[4893]: Killing ovn-controller (1) [ OK ] Mar 14 07:24:35 crc kubenswrapper[4893]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 14 07:24:35 crc kubenswrapper[4893]: 2026-03-14T07:24:28Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:24:35 crc kubenswrapper[4893]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Mar 14 07:24:35 crc kubenswrapper[4893]: > Mar 14 07:24:35 crc kubenswrapper[4893]: E0314 07:24:35.649695 4893 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 14 07:24:35 crc kubenswrapper[4893]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-14T07:24:28Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:24:35 crc kubenswrapper[4893]: /etc/init.d/functions: line 589: 386 
Alarm clock "$@" Mar 14 07:24:35 crc kubenswrapper[4893]: > pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" containerID="cri-o://9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.649728 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-8rcbf" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" containerID="cri-o://9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779" gracePeriod=22 Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.667963 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.852493 4893 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5cdc5f965f-t6wfv" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9696/\": dial tcp 10.217.0.170:9696: connect: connection refused" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.879736 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:24:35 crc kubenswrapper[4893]: I0314 07:24:35.947279 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022196 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022275 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022309 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022405 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022425 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022485 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022514 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022552 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022578 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5ws2\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022606 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.022629 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info\") pod \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\" (UID: \"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e\") " Mar 14 07:24:36 crc kubenswrapper[4893]: 
I0314 07:24:36.023075 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.023763 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.026916 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info" (OuterVolumeSpecName: "pod-info") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.027742 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.027770 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.034876 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.035735 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.049501 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2" (OuterVolumeSpecName: "kube-api-access-n5ws2") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "kube-api-access-n5ws2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.067030 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data" (OuterVolumeSpecName: "config-data") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.070117 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf" (OuterVolumeSpecName: "server-conf") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124426 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124474 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124607 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc 
kubenswrapper[4893]: I0314 07:24:36.124630 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5qkl\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124652 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124668 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124686 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124706 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124742 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124759 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.124779 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf\") pod \"a752b3c8-284e-490f-be39-506e7a075c6f\" (UID: \"a752b3c8-284e-490f-be39-506e7a075c6f\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125041 4893 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125054 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125063 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125081 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125092 4893 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125101 4893 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125110 4893 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125119 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5ws2\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-kube-api-access-n5ws2\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125128 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.125136 4893 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.132054 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.132873 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.134654 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" (UID: "7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.134958 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.136608 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.149704 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.155973 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info" (OuterVolumeSpecName: "pod-info") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.156022 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl" (OuterVolumeSpecName: "kube-api-access-k5qkl") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "kube-api-access-k5qkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.159704 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.167100 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data" (OuterVolumeSpecName: "config-data") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.167498 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.185082 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf" (OuterVolumeSpecName: "server-conf") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226472 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226603 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226663 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5qkl\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-kube-api-access-k5qkl\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226715 4893 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a752b3c8-284e-490f-be39-506e7a075c6f-erlang-cookie-secret\") on 
node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226772 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226835 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226888 4893 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a752b3c8-284e-490f-be39-506e7a075c6f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226944 4893 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.226997 4893 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a752b3c8-284e-490f-be39-506e7a075c6f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.227061 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.227124 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.227182 4893 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.232867 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a752b3c8-284e-490f-be39-506e7a075c6f" (UID: "a752b3c8-284e-490f-be39-506e7a075c6f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.244217 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.275344 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8rcbf_a4b44171-12ae-4a98-aac1-1adc9dff3941/ovn-controller/0.log" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.275408 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.280907 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.329361 4893 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a752b3c8-284e-490f-be39-506e7a075c6f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.329401 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430300 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430419 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430414 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430479 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430504 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.430563 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431006 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431072 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rzg\" (UniqueName: \"kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431101 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431140 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431183 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431218 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431341 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: 
\"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431378 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5bx6\" (UniqueName: \"kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431412 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data\") pod \"dc7c1963-417f-453f-8983-1c03d349f76d\" (UID: \"dc7c1963-417f-453f-8983-1c03d349f76d\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431446 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts\") pod \"a4b44171-12ae-4a98-aac1-1adc9dff3941\" (UID: \"a4b44171-12ae-4a98-aac1-1adc9dff3941\") " Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431816 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run" (OuterVolumeSpecName: "var-run") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431851 4893 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.431876 4893 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.432206 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.432248 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.434551 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg" (OuterVolumeSpecName: "kube-api-access-v2rzg") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "kube-api-access-v2rzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.435299 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts" (OuterVolumeSpecName: "scripts") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.436288 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6" (OuterVolumeSpecName: "kube-api-access-p5bx6") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "kube-api-access-p5bx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.438667 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts" (OuterVolumeSpecName: "scripts") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.452864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.461858 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.469815 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.501387 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data" (OuterVolumeSpecName: "config-data") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.501956 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7c1963-417f-453f-8983-1c03d349f76d" (UID: "dc7c1963-417f-453f-8983-1c03d349f76d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.509710 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "a4b44171-12ae-4a98-aac1-1adc9dff3941" (UID: "a4b44171-12ae-4a98-aac1-1adc9dff3941"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.532781 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rzg\" (UniqueName: \"kubernetes.io/projected/a4b44171-12ae-4a98-aac1-1adc9dff3941-kube-api-access-v2rzg\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.532817 4893 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b44171-12ae-4a98-aac1-1adc9dff3941-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.532830 4893 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533023 4893 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7c1963-417f-453f-8983-1c03d349f76d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533032 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533040 4893 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533049 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5bx6\" (UniqueName: \"kubernetes.io/projected/dc7c1963-417f-453f-8983-1c03d349f76d-kube-api-access-p5bx6\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533058 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533066 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b44171-12ae-4a98-aac1-1adc9dff3941-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533073 4893 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533081 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b44171-12ae-4a98-aac1-1adc9dff3941-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533089 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.533097 4893 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7c1963-417f-453f-8983-1c03d349f76d-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604733 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8rcbf_a4b44171-12ae-4a98-aac1-1adc9dff3941/ovn-controller/0.log" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604788 4893 generic.go:334] "Generic (PLEG): container finished" podID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerID="9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779" exitCode=137 Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604850 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf" event={"ID":"a4b44171-12ae-4a98-aac1-1adc9dff3941","Type":"ContainerDied","Data":"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604876 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8rcbf" event={"ID":"a4b44171-12ae-4a98-aac1-1adc9dff3941","Type":"ContainerDied","Data":"054ed7098a508e1c9b815a5bf9a718392396f2369433d55ab70a0ca01821ed1c"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604894 4893 scope.go:117] "RemoveContainer" containerID="9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.604907 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8rcbf" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.607950 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e","Type":"ContainerDied","Data":"daa8dcd940f0cd32125e66a1b55c50274984ef532a9dc79df60326ce40eec1e0"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.608039 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.612093 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a752b3c8-284e-490f-be39-506e7a075c6f","Type":"ContainerDied","Data":"0ba35f303367256d7eb006970aae07499ffd749050a21d793a6fe41fb1adbe09"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.612113 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.615970 4893 generic.go:334] "Generic (PLEG): container finished" podID="dc7c1963-417f-453f-8983-1c03d349f76d" containerID="93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000" exitCode=0 Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.616006 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerDied","Data":"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.616028 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7c1963-417f-453f-8983-1c03d349f76d","Type":"ContainerDied","Data":"3050af06d35ad99ac22b01a1de64fe78ff0ebebb248fd8993018c3814bc273cc"} Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.616100 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.637850 4893 scope.go:117] "RemoveContainer" containerID="9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779"
Mar 14 07:24:36 crc kubenswrapper[4893]: E0314 07:24:36.638764 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779\": container with ID starting with 9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779 not found: ID does not exist" containerID="9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.638795 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779"} err="failed to get container status \"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779\": rpc error: code = NotFound desc = could not find container \"9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779\": container with ID starting with 9e22a88384bee5ff968ae0cd7ac28fcae82795ada865fc08d38fb7e786168779 not found: ID does not exist"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.642230 4893 scope.go:117] "RemoveContainer" containerID="a5f24be09a6c54f86e91629c32362dd6fc01dad63e4f1a792a6a2d72329e86ae"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.652560 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.665486 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.678036 4893 scope.go:117] "RemoveContainer" containerID="682d8cd293d9dce0bae81b6b24fad5bbdb451e309a9432d29cbee73b0bde8366"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.678149 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.689987 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.699550 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.702851 4893 scope.go:117] "RemoveContainer" containerID="ee01de4008b1335a2c5408eb31930a4fb5131e254bc97793e1f6069614788d91"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.708568 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.715267 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8rcbf"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.717569 4893 scope.go:117] "RemoveContainer" containerID="292f113e5f439a52265f81f58611b262cd2d04d5dfef8d95fd26c5f4c46fb3b1"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.722078 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8rcbf"]
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.738640 4893 scope.go:117] "RemoveContainer" containerID="5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.765612 4893 scope.go:117] "RemoveContainer" containerID="93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.786481 4893 scope.go:117] "RemoveContainer" containerID="93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.814672 4893 scope.go:117] "RemoveContainer" containerID="55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.836291 4893 scope.go:117] "RemoveContainer" containerID="5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"
Mar 14 07:24:36 crc kubenswrapper[4893]: E0314 07:24:36.837003 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3\": container with ID starting with 5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3 not found: ID does not exist" containerID="5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.837060 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3"} err="failed to get container status \"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3\": rpc error: code = NotFound desc = could not find container \"5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3\": container with ID starting with 5de4204094b61d61b74c39b2bd96d3d51618d7bb7e19601f38fede117a170eb3 not found: ID does not exist"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.837097 4893 scope.go:117] "RemoveContainer" containerID="93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"
Mar 14 07:24:36 crc kubenswrapper[4893]: E0314 07:24:36.837566 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537\": container with ID starting with 93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537 not found: ID does not exist" containerID="93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.837657 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537"} err="failed to get container status \"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537\": rpc error: code = NotFound desc = could not find container \"93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537\": container with ID starting with 93c8ba18223ce6d9916a1a48fe599793ff7a55be5ba49065b9cec75d19d40537 not found: ID does not exist"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.837733 4893 scope.go:117] "RemoveContainer" containerID="93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"
Mar 14 07:24:36 crc kubenswrapper[4893]: E0314 07:24:36.838085 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000\": container with ID starting with 93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000 not found: ID does not exist" containerID="93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.838116 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000"} err="failed to get container status \"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000\": rpc error: code = NotFound desc = could not find container \"93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000\": container with ID starting with 93e846e5581223fc8c1f1a453c95a7636bc81b66f008e4b7deba28bbe41f5000 not found: ID does not exist"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.838156 4893 scope.go:117] "RemoveContainer" containerID="55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"
Mar 14 07:24:36 crc kubenswrapper[4893]: E0314 07:24:36.838486 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4\": container with ID starting with 55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4 not found: ID does not exist" containerID="55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"
Mar 14 07:24:36 crc kubenswrapper[4893]: I0314 07:24:36.838551 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4"} err="failed to get container status \"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4\": rpc error: code = NotFound desc = could not find container \"55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4\": container with ID starting with 55f440a65215e639fbf8031519f6a32152e474e4f88918f7a1c6749a48adaca4 not found: ID does not exist"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.391424 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" path="/var/lib/kubelet/pods/0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a/volumes"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.392406 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" path="/var/lib/kubelet/pods/7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e/volumes"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.392977 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" path="/var/lib/kubelet/pods/a4b44171-12ae-4a98-aac1-1adc9dff3941/volumes"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.394133 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" path="/var/lib/kubelet/pods/a752b3c8-284e-490f-be39-506e7a075c6f/volumes"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.394684 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" path="/var/lib/kubelet/pods/dc7c1963-417f-453f-8983-1c03d349f76d/volumes"
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.579765 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.580165 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerName="nova-scheduler-scheduler" containerID="cri-o://34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" gracePeriod=30
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.592429 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 14 07:24:37 crc kubenswrapper[4893]: I0314 07:24:37.592702 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" containerName="memcached" containerID="cri-o://3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749" gracePeriod=30
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.351953 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.414185 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cdc5f965f-t6wfv"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.475491 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle\") pod \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.475587 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs\") pod \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.475647 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config\") pod \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.475699 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgfn5\" (UniqueName: \"kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5\") pod \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.475778 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data\") pod \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\" (UID: \"6de6c560-1e2c-4dca-b4c2-be4e51a5300f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.476462 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6de6c560-1e2c-4dca-b4c2-be4e51a5300f" (UID: "6de6c560-1e2c-4dca-b4c2-be4e51a5300f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.477393 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data" (OuterVolumeSpecName: "config-data") pod "6de6c560-1e2c-4dca-b4c2-be4e51a5300f" (UID: "6de6c560-1e2c-4dca-b4c2-be4e51a5300f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.481655 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5" (OuterVolumeSpecName: "kube-api-access-dgfn5") pod "6de6c560-1e2c-4dca-b4c2-be4e51a5300f" (UID: "6de6c560-1e2c-4dca-b4c2-be4e51a5300f"). InnerVolumeSpecName "kube-api-access-dgfn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.509991 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de6c560-1e2c-4dca-b4c2-be4e51a5300f" (UID: "6de6c560-1e2c-4dca-b4c2-be4e51a5300f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.526200 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6de6c560-1e2c-4dca-b4c2-be4e51a5300f" (UID: "6de6c560-1e2c-4dca-b4c2-be4e51a5300f"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.576755 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.576827 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.576910 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2stgm\" (UniqueName: \"kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.576968 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577015 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577081 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577125 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config\") pod \"2bee0811-3177-4034-aa99-39158e55c44f\" (UID: \"2bee0811-3177-4034-aa99-39158e55c44f\") "
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577632 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgfn5\" (UniqueName: \"kubernetes.io/projected/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kube-api-access-dgfn5\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577666 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577687 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577707 4893 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.577725 4893 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6de6c560-1e2c-4dca-b4c2-be4e51a5300f-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.580176 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm" (OuterVolumeSpecName: "kube-api-access-2stgm") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "kube-api-access-2stgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.582269 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.622065 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.630069 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.639116 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config" (OuterVolumeSpecName: "config") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.640150 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.651327 4893 generic.go:334] "Generic (PLEG): container finished" podID="2bee0811-3177-4034-aa99-39158e55c44f" containerID="14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d" exitCode=0
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.651399 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cdc5f965f-t6wfv"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.651403 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerDied","Data":"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"}
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.651589 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cdc5f965f-t6wfv" event={"ID":"2bee0811-3177-4034-aa99-39158e55c44f","Type":"ContainerDied","Data":"b198a4813a42a64738b956734b9eed27d901c2fb3560a8a6d19529237f69b1ef"}
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.651635 4893 scope.go:117] "RemoveContainer" containerID="7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.654156 4893 generic.go:334] "Generic (PLEG): container finished" podID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" containerID="3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749" exitCode=0
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.654188 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.654198 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6de6c560-1e2c-4dca-b4c2-be4e51a5300f","Type":"ContainerDied","Data":"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"}
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.654249 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6de6c560-1e2c-4dca-b4c2-be4e51a5300f","Type":"ContainerDied","Data":"83202805168c64d677b4c93e1f0226f435335bc812fe65a507a45275611928bf"}
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.655077 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2bee0811-3177-4034-aa99-39158e55c44f" (UID: "2bee0811-3177-4034-aa99-39158e55c44f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.681628 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.681834 4893 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.681892 4893 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.681953 4893 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.682033 4893 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.682139 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2stgm\" (UniqueName: \"kubernetes.io/projected/2bee0811-3177-4034-aa99-39158e55c44f-kube-api-access-2stgm\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.682207 4893 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bee0811-3177-4034-aa99-39158e55c44f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.691178 4893 scope.go:117] "RemoveContainer" containerID="14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.695501 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.703362 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.731854 4893 scope.go:117] "RemoveContainer" containerID="7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"
Mar 14 07:24:39 crc kubenswrapper[4893]: E0314 07:24:39.732377 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b\": container with ID starting with 7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b not found: ID does not exist" containerID="7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.732417 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b"} err="failed to get container status \"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b\": rpc error: code = NotFound desc = could not find container \"7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b\": container with ID starting with 7a372b079b7d68e2f8f76d74a0d939fbb7c8e14fb0e7f1258273cfbe77ae9b8b not found: ID does not exist"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.732443 4893 scope.go:117] "RemoveContainer" containerID="14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"
Mar 14 07:24:39 crc kubenswrapper[4893]: E0314 07:24:39.733056 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d\": container with ID starting with 14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d not found: ID does not exist" containerID="14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.733150 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d"} err="failed to get container status \"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d\": rpc error: code = NotFound desc = could not find container \"14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d\": container with ID starting with 14a32fe9b5caad93b19511764a6626f209dfafffc629a6f33c688c048a09bb5d not found: ID does not exist"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.733236 4893 scope.go:117] "RemoveContainer" containerID="3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.819041 4893 scope.go:117] "RemoveContainer" containerID="3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"
Mar 14 07:24:39 crc kubenswrapper[4893]: E0314 07:24:39.821604 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749\": container with ID starting with 3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749 not found: ID does not exist" containerID="3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"
Mar 14 07:24:39 crc kubenswrapper[4893]: I0314 07:24:39.821640 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749"} err="failed to get container status \"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749\": rpc error: code = NotFound desc = could not find container \"3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749\": container with ID starting with 3976a4aa22870f7b24f354814138dccf9148f7886eec69f805fe6b26c9098749 not found: ID does not exist"
Mar 14 07:24:40 crc kubenswrapper[4893]: I0314 07:24:40.012885 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"]
Mar 14 07:24:40 crc kubenswrapper[4893]: I0314 07:24:40.022780 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5cdc5f965f-t6wfv"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.555980 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.557056 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.557609 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.557715 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server"
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.558487 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.561894 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.563352 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 14 07:24:40 crc kubenswrapper[4893]: E0314 07:24:40.563407 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd"
Mar 14 07:24:41 crc kubenswrapper[4893]: E0314 07:24:41.231647 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 07:24:41 crc kubenswrapper[4893]: E0314 07:24:41.233568 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 07:24:41 crc kubenswrapper[4893]: E0314 07:24:41.234920 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 07:24:41 crc kubenswrapper[4893]: E0314 07:24:41.234970 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerName="nova-scheduler-scheduler"
Mar 14 07:24:41 crc kubenswrapper[4893]: I0314 07:24:41.393570 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bee0811-3177-4034-aa99-39158e55c44f" path="/var/lib/kubelet/pods/2bee0811-3177-4034-aa99-39158e55c44f/volumes"
Mar 14 07:24:41 crc kubenswrapper[4893]: I0314 07:24:41.394356 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" path="/var/lib/kubelet/pods/6de6c560-1e2c-4dca-b4c2-be4e51a5300f/volumes"
Mar 14 07:24:43 crc kubenswrapper[4893]: I0314 07:24:43.170586 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qdfzm"
Mar 14 07:24:43 crc kubenswrapper[4893]: I0314 07:24:43.221285 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qdfzm"
Mar 14 07:24:43 crc kubenswrapper[4893]: I0314 07:24:43.416816 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"]
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.343630 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.465531 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng759\" (UniqueName: \"kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759\") pod \"6dd135e7-b208-4d7f-85f5-05baa2819788\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") "
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.465915 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data\") pod \"6dd135e7-b208-4d7f-85f5-05baa2819788\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") "
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.466034 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle\") pod \"6dd135e7-b208-4d7f-85f5-05baa2819788\" (UID: \"6dd135e7-b208-4d7f-85f5-05baa2819788\") "
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.470734 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759" (OuterVolumeSpecName: "kube-api-access-ng759") pod "6dd135e7-b208-4d7f-85f5-05baa2819788" (UID: "6dd135e7-b208-4d7f-85f5-05baa2819788"). InnerVolumeSpecName "kube-api-access-ng759". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.486360 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd135e7-b208-4d7f-85f5-05baa2819788" (UID: "6dd135e7-b208-4d7f-85f5-05baa2819788"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.495864 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data" (OuterVolumeSpecName: "config-data") pod "6dd135e7-b208-4d7f-85f5-05baa2819788" (UID: "6dd135e7-b208-4d7f-85f5-05baa2819788"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.567473 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng759\" (UniqueName: \"kubernetes.io/projected/6dd135e7-b208-4d7f-85f5-05baa2819788-kube-api-access-ng759\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.567509 4893 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.567536 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd135e7-b208-4d7f-85f5-05baa2819788-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.712475 4893 generic.go:334] "Generic (PLEG): container finished" podID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" exitCode=0 Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.712781 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qdfzm" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="registry-server" containerID="cri-o://cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320" gracePeriod=2 Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.712782 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dd135e7-b208-4d7f-85f5-05baa2819788","Type":"ContainerDied","Data":"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c"} Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.712840 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"6dd135e7-b208-4d7f-85f5-05baa2819788","Type":"ContainerDied","Data":"a73bc026e01dea0510554fd14e663dd76f64073e52963f45bd43cdbbc28f22ce"} Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.712864 4893 scope.go:117] "RemoveContainer" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.713408 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.766108 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.773122 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.774979 4893 scope.go:117] "RemoveContainer" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" Mar 14 07:24:44 crc kubenswrapper[4893]: E0314 07:24:44.775432 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c\": container with ID starting with 34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c not found: ID does not exist" containerID="34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c" Mar 14 07:24:44 crc kubenswrapper[4893]: I0314 07:24:44.775471 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c"} err="failed to get container status \"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c\": rpc error: code = NotFound desc = could not find container \"34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c\": container with ID starting with 
34fd8df3d0beb69b7fc8e1a539f4034f7bd7a4fd0c322b7a5fd50447ba69c38c not found: ID does not exist" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.228019 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.382382 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities\") pod \"6354f443-b3e8-4932-a319-315187cebac7\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.382554 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content\") pod \"6354f443-b3e8-4932-a319-315187cebac7\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.382597 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxzlx\" (UniqueName: \"kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx\") pod \"6354f443-b3e8-4932-a319-315187cebac7\" (UID: \"6354f443-b3e8-4932-a319-315187cebac7\") " Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.383365 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities" (OuterVolumeSpecName: "utilities") pod "6354f443-b3e8-4932-a319-315187cebac7" (UID: "6354f443-b3e8-4932-a319-315187cebac7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.388182 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx" (OuterVolumeSpecName: "kube-api-access-nxzlx") pod "6354f443-b3e8-4932-a319-315187cebac7" (UID: "6354f443-b3e8-4932-a319-315187cebac7"). InnerVolumeSpecName "kube-api-access-nxzlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.388821 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" path="/var/lib/kubelet/pods/6dd135e7-b208-4d7f-85f5-05baa2819788/volumes" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.484169 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.484201 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxzlx\" (UniqueName: \"kubernetes.io/projected/6354f443-b3e8-4932-a319-315187cebac7-kube-api-access-nxzlx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.556496 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.556843 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.557049 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.557596 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.557636 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.558385 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 
07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.559444 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.559476 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.588257 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6354f443-b3e8-4932-a319-315187cebac7" (UID: "6354f443-b3e8-4932-a319-315187cebac7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.686690 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354f443-b3e8-4932-a319-315187cebac7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.724883 4893 generic.go:334] "Generic (PLEG): container finished" podID="6354f443-b3e8-4932-a319-315187cebac7" containerID="cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320" exitCode=0 Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.724934 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdfzm" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.724938 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerDied","Data":"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320"} Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.725073 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdfzm" event={"ID":"6354f443-b3e8-4932-a319-315187cebac7","Type":"ContainerDied","Data":"7dd45de766400d5e4e9294f0ecec545522417997e7d7f797447d1064f473c43e"} Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.725107 4893 scope.go:117] "RemoveContainer" containerID="cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.757367 4893 scope.go:117] "RemoveContainer" containerID="d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.762466 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"] Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.780077 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qdfzm"] Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.783199 4893 scope.go:117] "RemoveContainer" containerID="ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.815047 4893 scope.go:117] "RemoveContainer" containerID="cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320" Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.815652 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320\": container with ID starting with cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320 not found: ID does not exist" containerID="cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.815699 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320"} err="failed to get container status \"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320\": rpc error: code = NotFound desc = could not find container \"cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320\": container with ID starting with cfd46e26e2417c4760a87b5ea2ca9a866f488b5a7b73d8dc52caaa1ec2cc8320 not found: ID does not exist" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.815724 4893 scope.go:117] "RemoveContainer" containerID="d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76" Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.816015 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76\": container with ID starting with d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76 not found: ID does not exist" containerID="d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.816043 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76"} err="failed to get container status \"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76\": rpc error: code = NotFound desc = could not find container \"d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76\": container with ID 
starting with d78a88db10bea0199bec89514dcb2f2dc2495ab3ecf23dfd2f0fa4a8292fcd76 not found: ID does not exist" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.816058 4893 scope.go:117] "RemoveContainer" containerID="ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4" Mar 14 07:24:45 crc kubenswrapper[4893]: E0314 07:24:45.816478 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4\": container with ID starting with ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4 not found: ID does not exist" containerID="ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4" Mar 14 07:24:45 crc kubenswrapper[4893]: I0314 07:24:45.816512 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4"} err="failed to get container status \"ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4\": rpc error: code = NotFound desc = could not find container \"ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4\": container with ID starting with ed6a5148c6226117649b26539c8c901ac8fbb8d23015579f9313a9d1a00f84e4 not found: ID does not exist" Mar 14 07:24:47 crc kubenswrapper[4893]: I0314 07:24:47.424182 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6354f443-b3e8-4932-a319-315187cebac7" path="/var/lib/kubelet/pods/6354f443-b3e8-4932-a319-315187cebac7/volumes" Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.556178 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" 
containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.557218 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.557670 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.557697 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.558501 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 
07:24:50.560021 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.561290 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:50 crc kubenswrapper[4893]: E0314 07:24:50.561329 4893 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.557190 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560481 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560554 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560580 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560592 4893 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560607 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560618 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560636 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="probe" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560647 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="probe" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560667 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560678 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-api" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560702 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerName="nova-scheduler-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560713 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerName="nova-scheduler-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560729 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="ovn-northd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560739 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="ovn-northd" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560755 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560765 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560784 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560795 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560817 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="openstack-network-exporter" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560828 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="openstack-network-exporter" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560852 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="extract-utilities" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560889 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="extract-utilities" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560907 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="setup-container" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560920 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="setup-container" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560938 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-central-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.560949 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-central-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.560966 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerName="nova-cell1-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561121 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerName="nova-cell1-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561146 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561158 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-api" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561194 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="proxy-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561206 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="proxy-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561223 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561234 4893 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561253 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cad381a-c3bf-4fc8-a314-6f45028f3482" containerName="kube-state-metrics" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561265 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cad381a-c3bf-4fc8-a314-6f45028f3482" containerName="kube-state-metrics" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561285 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="mysql-bootstrap" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561298 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="mysql-bootstrap" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561315 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561328 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561344 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561355 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561376 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="mysql-bootstrap" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561387 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="mysql-bootstrap" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561408 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561419 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561432 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="setup-container" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561444 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="setup-container" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561461 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561472 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561564 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561576 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561598 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561610 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561630 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="extract-content" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561641 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="extract-content" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561661 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561672 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561690 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" containerName="memcached" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561700 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" containerName="memcached" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561721 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="cinder-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561734 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="cinder-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561751 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="sg-core" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561761 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="sg-core" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561774 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-metadata" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561785 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-metadata" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561802 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-notification-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561812 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-notification-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561830 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="registry-server" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561841 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="registry-server" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561861 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" containerName="nova-cell0-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561872 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" containerName="nova-cell0-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561890 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561900 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561918 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561929 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561950 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561962 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.561980 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.561990 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.562009 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562019 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.562032 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562044 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.562063 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" containerName="keystone-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562075 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" containerName="keystone-api" Mar 14 07:24:52 crc kubenswrapper[4893]: E0314 07:24:52.562093 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562105 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562416 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="sg-core" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562439 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562455 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6afb83-bfaa-41a6-8429-b8588d82c7a7" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562472 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562492 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="444c0d3d-4ad4-47a3-9281-b7028d69a78a" containerName="galera" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562503 4893 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7d5eee3d-8b3a-40c0-9e1a-8ed21212dc1e" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562541 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="openstack-network-exporter" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562555 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f57646-651c-4b8f-b73d-6606d06fa3a3" containerName="nova-cell1-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562571 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="cinder-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562587 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="proxy-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562601 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefe82b2-447a-4f97-8221-7050b61ef60c" containerName="barbican-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562614 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562631 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562649 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562831 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562853 
4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07f9759-5cdb-4e42-b7c6-714d0e34ee55" containerName="glance-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562874 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6354f443-b3e8-4932-a319-315187cebac7" containerName="registry-server" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562896 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd135e7-b208-4d7f-85f5-05baa2819788" containerName="nova-scheduler-scheduler" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562911 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cad381a-c3bf-4fc8-a314-6f45028f3482" containerName="kube-state-metrics" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562924 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1239ac87-7084-45c6-9eef-ecab07108656" containerName="probe" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562943 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-central-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562963 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a6e4cc-c5a8-4551-bd39-ab4eb11331df" containerName="ovn-northd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562981 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.562995 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c1963-417f-453f-8983-1c03d349f76d" containerName="ceilometer-notification-agent" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563059 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2195ecfb-6eeb-48f1-8b55-c57520974663" containerName="placement-api" Mar 14 07:24:52 crc 
kubenswrapper[4893]: I0314 07:24:52.563080 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563103 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d5e368-4fac-48b9-a64c-717f3acf9388" containerName="nova-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563171 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a752b3c8-284e-490f-be39-506e7a075c6f" containerName="rabbitmq" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563194 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0193b07f-cfa8-4721-bc4c-ef7f3f0d2d2a" containerName="keystone-api" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563210 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b44171-12ae-4a98-aac1-1adc9dff3941" containerName="ovn-controller" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563227 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de6c560-1e2c-4dca-b4c2-be4e51a5300f" containerName="memcached" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563242 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bee0811-3177-4034-aa99-39158e55c44f" containerName="neutron-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563261 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1eed68-2de8-46ed-91c9-3eb4fe897d3a" containerName="glance-httpd" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563274 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab16027b-4fcf-42bf-b586-a7b8ff348305" containerName="cinder-api-log" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563292 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-log" Mar 14 07:24:52 
crc kubenswrapper[4893]: I0314 07:24:52.563306 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5d1a2a-ad9f-4eb7-8a70-37a1523a1b6f" containerName="nova-cell0-conductor-conductor" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.563324 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ca04fa-accd-437a-ab63-d39d14a49777" containerName="nova-metadata-metadata" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.564248 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6be5c8e-c381-4e29-90e7-069d902c1805" containerName="mariadb-account-create-update" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.565652 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.574199 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.600234 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.600334 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfkw\" (UniqueName: \"kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.600396 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.702123 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.702205 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfkw\" (UniqueName: \"kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.702264 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.702657 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.702841 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.727574 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfkw\" (UniqueName: \"kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw\") pod \"certified-operators-lskvp\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:52 crc kubenswrapper[4893]: I0314 07:24:52.902276 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:24:53 crc kubenswrapper[4893]: I0314 07:24:53.386757 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:24:53 crc kubenswrapper[4893]: W0314 07:24:53.391570 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591ee530_a0a9_4d02_81e1_c6b319538818.slice/crio-ab4bdf606cd983524407b6f07069e09a962c270eeca0bb00e3cbd96da61b0a4a WatchSource:0}: Error finding container ab4bdf606cd983524407b6f07069e09a962c270eeca0bb00e3cbd96da61b0a4a: Status 404 returned error can't find the container with id ab4bdf606cd983524407b6f07069e09a962c270eeca0bb00e3cbd96da61b0a4a Mar 14 07:24:53 crc kubenswrapper[4893]: I0314 07:24:53.810727 4893 generic.go:334] "Generic (PLEG): container finished" podID="591ee530-a0a9-4d02-81e1-c6b319538818" containerID="4f844164e42aa2116315b5e86c4bd4b988de56eae5a30eac39ecd10731a9b0bf" exitCode=0 Mar 14 07:24:53 crc kubenswrapper[4893]: I0314 07:24:53.810793 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" 
event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerDied","Data":"4f844164e42aa2116315b5e86c4bd4b988de56eae5a30eac39ecd10731a9b0bf"} Mar 14 07:24:53 crc kubenswrapper[4893]: I0314 07:24:53.810833 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerStarted","Data":"ab4bdf606cd983524407b6f07069e09a962c270eeca0bb00e3cbd96da61b0a4a"} Mar 14 07:24:54 crc kubenswrapper[4893]: I0314 07:24:54.822585 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerStarted","Data":"cf3704ab8cb20fd9bcc707621442037d171a3addf4c2fd921977d25807c827f9"} Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.556402 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.556910 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.557289 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.557327 4893 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.557750 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.560002 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.561243 4893 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:55 crc kubenswrapper[4893]: E0314 07:24:55.561276 4893 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bwq2l" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:24:55 crc kubenswrapper[4893]: I0314 07:24:55.838096 4893 generic.go:334] "Generic (PLEG): container finished" podID="591ee530-a0a9-4d02-81e1-c6b319538818" containerID="cf3704ab8cb20fd9bcc707621442037d171a3addf4c2fd921977d25807c827f9" exitCode=0 Mar 14 07:24:55 crc kubenswrapper[4893]: I0314 07:24:55.838242 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerDied","Data":"cf3704ab8cb20fd9bcc707621442037d171a3addf4c2fd921977d25807c827f9"} Mar 14 07:24:57 crc kubenswrapper[4893]: I0314 07:24:57.860889 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerStarted","Data":"374f4419381c981b99844482ad64e5359e7761ec72c7f14b70e6936b009e2516"} Mar 14 07:24:57 crc kubenswrapper[4893]: I0314 07:24:57.862715 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwq2l_ec3a7835-99ba-4d0d-b81d-2dea0dc7128b/ovs-vswitchd/0.log" Mar 14 07:24:57 crc kubenswrapper[4893]: I0314 07:24:57.863375 4893 generic.go:334] "Generic (PLEG): container finished" podID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" exitCode=137 Mar 14 07:24:57 crc kubenswrapper[4893]: I0314 07:24:57.863410 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerDied","Data":"815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852"} Mar 14 07:24:57 crc 
kubenswrapper[4893]: I0314 07:24:57.880442 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lskvp" podStartSLOduration=2.751617587 podStartE2EDuration="5.880426348s" podCreationTimestamp="2026-03-14 07:24:52 +0000 UTC" firstStartedPulling="2026-03-14 07:24:53.813892622 +0000 UTC m=+1573.076069454" lastFinishedPulling="2026-03-14 07:24:56.942701403 +0000 UTC m=+1576.204878215" observedRunningTime="2026-03-14 07:24:57.880005798 +0000 UTC m=+1577.142182610" watchObservedRunningTime="2026-03-14 07:24:57.880426348 +0000 UTC m=+1577.142603130" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.530190 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwq2l_ec3a7835-99ba-4d0d-b81d-2dea0dc7128b/ovs-vswitchd/0.log" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.531543 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.532238 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.625773 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.625846 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hc7q\" (UniqueName: \"kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.625877 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626019 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626055 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626079 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87cp7\" (UniqueName: 
\"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626098 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626115 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626138 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626164 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\" (UID: \"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626195 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626238 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log\") pod \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\" (UID: \"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b\") " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.626583 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log" (OuterVolumeSpecName: "var-log") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.627463 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib" (OuterVolumeSpecName: "var-lib") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.627930 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run" (OuterVolumeSpecName: "var-run") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.627963 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock" (OuterVolumeSpecName: "lock") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.628013 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.628169 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache" (OuterVolumeSpecName: "cache") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.628550 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts" (OuterVolumeSpecName: "scripts") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.633254 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.633296 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q" (OuterVolumeSpecName: "kube-api-access-4hc7q") pod "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" (UID: "ec3a7835-99ba-4d0d-b81d-2dea0dc7128b"). InnerVolumeSpecName "kube-api-access-4hc7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.634038 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7" (OuterVolumeSpecName: "kube-api-access-87cp7") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "kube-api-access-87cp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.636474 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728490 4893 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728826 4893 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728840 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87cp7\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-kube-api-access-87cp7\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728854 4893 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-cache\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728865 4893 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728876 4893 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728911 4893 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728925 4893 reconciler_common.go:293] "Volume detached for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728938 4893 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728949 4893 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.728959 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hc7q\" (UniqueName: \"kubernetes.io/projected/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b-kube-api-access-4hc7q\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.744332 4893 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.831588 4893 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.880317 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwq2l_ec3a7835-99ba-4d0d-b81d-2dea0dc7128b/ovs-vswitchd/0.log" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.881566 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwq2l" event={"ID":"ec3a7835-99ba-4d0d-b81d-2dea0dc7128b","Type":"ContainerDied","Data":"4272f8f5af088c355e4ceba4df8758997438611eb4234c6ad57c12eec356d5bb"} Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.881944 4893 scope.go:117] 
"RemoveContainer" containerID="815433c66f3f469063c8f960c990cc45531d8077d9256e3db6d30e4aeb47f852" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.881602 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bwq2l" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.894103 4893 generic.go:334] "Generic (PLEG): container finished" podID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerID="50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5" exitCode=137 Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.895432 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.896705 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5"} Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.896756 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"079232b7-87bb-42cf-96ff-1eb2d1cfe2b5","Type":"ContainerDied","Data":"6e826cc186438c3a3cd288097294739cb23b36306cb0e69a85d3829ae1ec9589"} Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.927671 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.930748 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" (UID: "079232b7-87bb-42cf-96ff-1eb2d1cfe2b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.932454 4893 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.933875 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-bwq2l"] Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.942413 4893 scope.go:117] "RemoveContainer" containerID="14dfeeb1ac493f8c868ceb12cb9f2623644d493f702d4b8d551c54f2d4fd82cb" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.965751 4893 scope.go:117] "RemoveContainer" containerID="56801d7dbcb1352157e545571b9cb4eda563977d1b3b6342a302dc72311e1b23" Mar 14 07:24:58 crc kubenswrapper[4893]: I0314 07:24:58.995376 4893 scope.go:117] "RemoveContainer" containerID="50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.015150 4893 scope.go:117] "RemoveContainer" containerID="61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.031630 4893 scope.go:117] "RemoveContainer" containerID="42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.061346 4893 scope.go:117] "RemoveContainer" containerID="8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.089934 4893 scope.go:117] "RemoveContainer" containerID="b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.110623 4893 scope.go:117] "RemoveContainer" containerID="c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.130569 4893 scope.go:117] 
"RemoveContainer" containerID="5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.152423 4893 scope.go:117] "RemoveContainer" containerID="273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.172100 4893 scope.go:117] "RemoveContainer" containerID="c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.192558 4893 scope.go:117] "RemoveContainer" containerID="68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.210475 4893 scope.go:117] "RemoveContainer" containerID="9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.236845 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.241483 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.254972 4893 scope.go:117] "RemoveContainer" containerID="265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.272931 4893 scope.go:117] "RemoveContainer" containerID="10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.288586 4893 scope.go:117] "RemoveContainer" containerID="8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.312713 4893 scope.go:117] "RemoveContainer" containerID="e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.350800 4893 scope.go:117] "RemoveContainer" 
containerID="50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.351289 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5\": container with ID starting with 50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5 not found: ID does not exist" containerID="50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.351319 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5"} err="failed to get container status \"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5\": rpc error: code = NotFound desc = could not find container \"50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5\": container with ID starting with 50280f9973eaf24cfaecf1b0325f81232bba8db39fef91743adc8159634ae0a5 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.351339 4893 scope.go:117] "RemoveContainer" containerID="61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.351663 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d\": container with ID starting with 61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d not found: ID does not exist" containerID="61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.351721 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d"} err="failed to get container status \"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d\": rpc error: code = NotFound desc = could not find container \"61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d\": container with ID starting with 61f20b3d153fc2b7e2f88fb820282348d684b7c7d0ab7a0173efe70cf59b854d not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.351755 4893 scope.go:117] "RemoveContainer" containerID="42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.352158 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936\": container with ID starting with 42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936 not found: ID does not exist" containerID="42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352181 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936"} err="failed to get container status \"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936\": rpc error: code = NotFound desc = could not find container \"42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936\": container with ID starting with 42879a04fa13be12602354e7d2287565ac8c111d2ae1985e6be2c424ccdde936 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352196 4893 scope.go:117] "RemoveContainer" containerID="8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.352495 4893 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177\": container with ID starting with 8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177 not found: ID does not exist" containerID="8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352538 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177"} err="failed to get container status \"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177\": rpc error: code = NotFound desc = could not find container \"8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177\": container with ID starting with 8e99f57da5e8b525a1c282c63776aab48b212ab715d4fbb62b57b9f55dd89177 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352557 4893 scope.go:117] "RemoveContainer" containerID="b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.352849 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3\": container with ID starting with b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3 not found: ID does not exist" containerID="b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352877 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3"} err="failed to get container status \"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3\": rpc error: code = NotFound desc = could not find container 
\"b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3\": container with ID starting with b2ee065d0fc6e1370be16e3b3da813d71fb595937dc649af8cca1aaacddcfda3 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.352893 4893 scope.go:117] "RemoveContainer" containerID="c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.353147 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098\": container with ID starting with c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098 not found: ID does not exist" containerID="c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353172 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098"} err="failed to get container status \"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098\": rpc error: code = NotFound desc = could not find container \"c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098\": container with ID starting with c81f7949be58dba02aa8c25ce6d88e2e1c6c71c62d11b3a1abcfa0c71bd14098 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353188 4893 scope.go:117] "RemoveContainer" containerID="5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.353509 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29\": container with ID starting with 5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29 not found: ID does not exist" 
containerID="5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353551 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29"} err="failed to get container status \"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29\": rpc error: code = NotFound desc = could not find container \"5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29\": container with ID starting with 5caf4411272c4d0599dad02231d0a82c29d93699fd274a70d9030405888f8c29 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353570 4893 scope.go:117] "RemoveContainer" containerID="273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.353870 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1\": container with ID starting with 273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1 not found: ID does not exist" containerID="273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353927 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1"} err="failed to get container status \"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1\": rpc error: code = NotFound desc = could not find container \"273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1\": container with ID starting with 273a5c886b9d88d838c2d721fc03cc49f0a45e38b5a0c201ed139fec6b5f6ee1 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.353970 4893 scope.go:117] 
"RemoveContainer" containerID="c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.354411 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051\": container with ID starting with c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051 not found: ID does not exist" containerID="c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.354442 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051"} err="failed to get container status \"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051\": rpc error: code = NotFound desc = could not find container \"c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051\": container with ID starting with c23ee49f390ead1ae4f7fca83deb5f84581ceae59f8e8ae8455a1874dfe7b051 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.354459 4893 scope.go:117] "RemoveContainer" containerID="68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.354883 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d\": container with ID starting with 68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d not found: ID does not exist" containerID="68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.354919 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d"} err="failed to get container status \"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d\": rpc error: code = NotFound desc = could not find container \"68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d\": container with ID starting with 68d10b4deb6f11784dbbc34682c64f702addc8c78d192179a6921060f8643b7d not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.354939 4893 scope.go:117] "RemoveContainer" containerID="9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.355179 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9\": container with ID starting with 9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9 not found: ID does not exist" containerID="9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355202 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9"} err="failed to get container status \"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9\": rpc error: code = NotFound desc = could not find container \"9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9\": container with ID starting with 9c507561a2ac6d068928170405d1a9bf41f0375a26767e40cbfa1697910cfdd9 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355216 4893 scope.go:117] "RemoveContainer" containerID="265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.355429 4893 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10\": container with ID starting with 265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10 not found: ID does not exist" containerID="265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355453 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10"} err="failed to get container status \"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10\": rpc error: code = NotFound desc = could not find container \"265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10\": container with ID starting with 265624aa91f386fecc5aba702935646c1883c30507f6b5633f11a004e1345c10 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355466 4893 scope.go:117] "RemoveContainer" containerID="10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.355664 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2\": container with ID starting with 10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2 not found: ID does not exist" containerID="10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355683 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2"} err="failed to get container status \"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2\": rpc error: code = NotFound desc = could not find container 
\"10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2\": container with ID starting with 10212c44819536c44b3626faff9ed9fd4a4329b9deab21102b0375f256bb23d2 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355695 4893 scope.go:117] "RemoveContainer" containerID="8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.355935 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3\": container with ID starting with 8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3 not found: ID does not exist" containerID="8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355961 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3"} err="failed to get container status \"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3\": rpc error: code = NotFound desc = could not find container \"8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3\": container with ID starting with 8181ccf3acc345fc42f8615ca13010757b2d59b421f286383ec54a73844363c3 not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.355979 4893 scope.go:117] "RemoveContainer" containerID="e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.356264 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f\": container with ID starting with e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f not found: ID does not exist" 
containerID="e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.356319 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f"} err="failed to get container status \"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f\": rpc error: code = NotFound desc = could not find container \"e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f\": container with ID starting with e6185392ccc78feefed6979a46060f7ab1bf26d3d5dd3f3682205a3151cc587f not found: ID does not exist" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.385901 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" path="/var/lib/kubelet/pods/079232b7-87bb-42cf-96ff-1eb2d1cfe2b5/volumes" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.387963 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" path="/var/lib/kubelet/pods/ec3a7835-99ba-4d0d-b81d-2dea0dc7128b/volumes" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.512600 4893 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod457f660f-9b87-4d37-a92e-0c30bb2a2fea"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod457f660f-9b87-4d37-a92e-0c30bb2a2fea] : Timed out while waiting for systemd to remove kubepods-besteffort-pod457f660f_9b87_4d37_a92e_0c30bb2a2fea.slice" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.512690 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod457f660f-9b87-4d37-a92e-0c30bb2a2fea] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod457f660f-9b87-4d37-a92e-0c30bb2a2fea] : Timed out while waiting for systemd to remove 
kubepods-besteffort-pod457f660f_9b87_4d37_a92e_0c30bb2a2fea.slice" pod="openstack/ovn-controller-metrics-7bbf4" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.517722 4893 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5241cf43-f60b-4499-ae07-6b449f6ef57e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5241cf43-f60b-4499-ae07-6b449f6ef57e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5241cf43_f60b_4499_ae07_6b449f6ef57e.slice" Mar 14 07:24:59 crc kubenswrapper[4893]: E0314 07:24:59.517765 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5241cf43-f60b-4499-ae07-6b449f6ef57e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5241cf43-f60b-4499-ae07-6b449f6ef57e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5241cf43_f60b_4499_ae07_6b449f6ef57e.slice" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.731061 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.731139 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.913210 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-7qsrf" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.914273 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7bbf4" Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.941973 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.951366 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-7bbf4"] Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.958099 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"] Mar 14 07:24:59 crc kubenswrapper[4893]: I0314 07:24:59.964421 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-7qsrf"] Mar 14 07:25:01 crc kubenswrapper[4893]: I0314 07:25:01.397955 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457f660f-9b87-4d37-a92e-0c30bb2a2fea" path="/var/lib/kubelet/pods/457f660f-9b87-4d37-a92e-0c30bb2a2fea/volumes" Mar 14 07:25:01 crc kubenswrapper[4893]: I0314 07:25:01.399071 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5241cf43-f60b-4499-ae07-6b449f6ef57e" path="/var/lib/kubelet/pods/5241cf43-f60b-4499-ae07-6b449f6ef57e/volumes" Mar 14 07:25:01 crc kubenswrapper[4893]: I0314 07:25:01.525755 4893 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5296cc7d-3008-44e6-ae0b-f88c333e13aa"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5296cc7d-3008-44e6-ae0b-f88c333e13aa] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5296cc7d_3008_44e6_ae0b_f88c333e13aa.slice" Mar 14 07:25:01 crc kubenswrapper[4893]: E0314 07:25:01.525855 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup 
paths for [kubepods besteffort pod5296cc7d-3008-44e6-ae0b-f88c333e13aa] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5296cc7d-3008-44e6-ae0b-f88c333e13aa] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5296cc7d_3008_44e6_ae0b_f88c333e13aa.slice" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" podUID="5296cc7d-3008-44e6-ae0b-f88c333e13aa" Mar 14 07:25:01 crc kubenswrapper[4893]: I0314 07:25:01.932039 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-259c-account-create-update-5jsnr" Mar 14 07:25:01 crc kubenswrapper[4893]: I0314 07:25:01.999633 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:25:02 crc kubenswrapper[4893]: I0314 07:25:02.007307 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-259c-account-create-update-5jsnr"] Mar 14 07:25:02 crc kubenswrapper[4893]: I0314 07:25:02.903050 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:02 crc kubenswrapper[4893]: I0314 07:25:02.903106 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:02 crc kubenswrapper[4893]: I0314 07:25:02.975072 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:03 crc kubenswrapper[4893]: I0314 07:25:03.389963 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5296cc7d-3008-44e6-ae0b-f88c333e13aa" path="/var/lib/kubelet/pods/5296cc7d-3008-44e6-ae0b-f88c333e13aa/volumes" Mar 14 07:25:03 crc kubenswrapper[4893]: I0314 07:25:03.793042 4893 scope.go:117] "RemoveContainer" containerID="748fde2a5a1f03fcf4446be0f3aae719eb6b6ddc7af7b25240b22a1f8bfac2ea" Mar 14 07:25:03 crc 
kubenswrapper[4893]: I0314 07:25:03.993867 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:04 crc kubenswrapper[4893]: I0314 07:25:04.045298 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:25:05 crc kubenswrapper[4893]: I0314 07:25:05.969871 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lskvp" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="registry-server" containerID="cri-o://374f4419381c981b99844482ad64e5359e7761ec72c7f14b70e6936b009e2516" gracePeriod=2 Mar 14 07:25:07 crc kubenswrapper[4893]: I0314 07:25:07.991734 4893 generic.go:334] "Generic (PLEG): container finished" podID="591ee530-a0a9-4d02-81e1-c6b319538818" containerID="374f4419381c981b99844482ad64e5359e7761ec72c7f14b70e6936b009e2516" exitCode=0 Mar 14 07:25:07 crc kubenswrapper[4893]: I0314 07:25:07.991814 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerDied","Data":"374f4419381c981b99844482ad64e5359e7761ec72c7f14b70e6936b009e2516"} Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.612060 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.795534 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfkw\" (UniqueName: \"kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw\") pod \"591ee530-a0a9-4d02-81e1-c6b319538818\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.795585 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content\") pod \"591ee530-a0a9-4d02-81e1-c6b319538818\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.795645 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities\") pod \"591ee530-a0a9-4d02-81e1-c6b319538818\" (UID: \"591ee530-a0a9-4d02-81e1-c6b319538818\") " Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.796705 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities" (OuterVolumeSpecName: "utilities") pod "591ee530-a0a9-4d02-81e1-c6b319538818" (UID: "591ee530-a0a9-4d02-81e1-c6b319538818"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.803232 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw" (OuterVolumeSpecName: "kube-api-access-kjfkw") pod "591ee530-a0a9-4d02-81e1-c6b319538818" (UID: "591ee530-a0a9-4d02-81e1-c6b319538818"). InnerVolumeSpecName "kube-api-access-kjfkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.845564 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "591ee530-a0a9-4d02-81e1-c6b319538818" (UID: "591ee530-a0a9-4d02-81e1-c6b319538818"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.897988 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfkw\" (UniqueName: \"kubernetes.io/projected/591ee530-a0a9-4d02-81e1-c6b319538818-kube-api-access-kjfkw\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.898041 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:08 crc kubenswrapper[4893]: I0314 07:25:08.898060 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ee530-a0a9-4d02-81e1-c6b319538818-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.001670 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lskvp" event={"ID":"591ee530-a0a9-4d02-81e1-c6b319538818","Type":"ContainerDied","Data":"ab4bdf606cd983524407b6f07069e09a962c270eeca0bb00e3cbd96da61b0a4a"} Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.001717 4893 scope.go:117] "RemoveContainer" containerID="374f4419381c981b99844482ad64e5359e7761ec72c7f14b70e6936b009e2516" Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.001737 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lskvp" Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.027101 4893 scope.go:117] "RemoveContainer" containerID="cf3704ab8cb20fd9bcc707621442037d171a3addf4c2fd921977d25807c827f9" Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.041374 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.071349 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lskvp"] Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.080173 4893 scope.go:117] "RemoveContainer" containerID="4f844164e42aa2116315b5e86c4bd4b988de56eae5a30eac39ecd10731a9b0bf" Mar 14 07:25:09 crc kubenswrapper[4893]: I0314 07:25:09.394391 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" path="/var/lib/kubelet/pods/591ee530-a0a9-4d02-81e1-c6b319538818/volumes" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.321815 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.323188 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server-init" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.323287 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server-init" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.323391 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="swift-recon-cron" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.323512 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" 
containerName="swift-recon-cron" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.323634 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.323721 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.323856 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="extract-utilities" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.323939 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="extract-utilities" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.324013 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-expirer" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.324095 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-expirer" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.324183 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.324263 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.324344 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.324459 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" 
containerName="container-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.324597 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.324728 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-server" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.324857 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.325004 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.325093 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.325177 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.325261 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.325334 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.325413 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.325509 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" 
containerName="account-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.325665 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.325772 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.325906 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-reaper" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.326011 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-reaper" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.326171 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.326295 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.326439 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.326577 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.326685 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.326824 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" 
containerName="account-server" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.326960 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="rsync" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.327082 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="rsync" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.327200 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.327314 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-server" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.327434 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="extract-content" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.327568 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="extract-content" Mar 14 07:25:12 crc kubenswrapper[4893]: E0314 07:25:12.327686 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="registry-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.327816 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="registry-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328160 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328290 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" 
containerName="container-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328412 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-reaper" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328553 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328676 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-expirer" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328811 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovsdb-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.328935 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="591ee530-a0a9-4d02-81e1-c6b319538818" containerName="registry-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329045 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="swift-recon-cron" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329181 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-updater" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329299 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329423 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329578 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="container-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329713 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-replicator" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329825 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3a7835-99ba-4d0d-b81d-2dea0dc7128b" containerName="ovs-vswitchd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.329933 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="rsync" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.330052 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.330190 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="object-auditor" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.331207 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="079232b7-87bb-42cf-96ff-1eb2d1cfe2b5" containerName="account-server" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.332624 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.332811 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.452786 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfdjx\" (UniqueName: \"kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.452916 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.452995 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.554831 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfdjx\" (UniqueName: \"kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.554918 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content\") pod 
\"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.554990 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.555668 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.557939 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.578830 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfdjx\" (UniqueName: \"kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx\") pod \"redhat-marketplace-7wkjd\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.677972 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:12 crc kubenswrapper[4893]: I0314 07:25:12.935558 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:13 crc kubenswrapper[4893]: I0314 07:25:13.042745 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerStarted","Data":"9897e2c941788b9cc5b6b557f1b8ab0abf4a7218b883ab637abc6bac1c3e74cc"} Mar 14 07:25:14 crc kubenswrapper[4893]: I0314 07:25:14.058616 4893 generic.go:334] "Generic (PLEG): container finished" podID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerID="4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0" exitCode=0 Mar 14 07:25:14 crc kubenswrapper[4893]: I0314 07:25:14.058678 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerDied","Data":"4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0"} Mar 14 07:25:16 crc kubenswrapper[4893]: I0314 07:25:16.087966 4893 generic.go:334] "Generic (PLEG): container finished" podID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerID="ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc" exitCode=0 Mar 14 07:25:16 crc kubenswrapper[4893]: I0314 07:25:16.088036 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerDied","Data":"ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc"} Mar 14 07:25:17 crc kubenswrapper[4893]: I0314 07:25:17.096769 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" 
event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerStarted","Data":"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c"} Mar 14 07:25:17 crc kubenswrapper[4893]: I0314 07:25:17.117228 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7wkjd" podStartSLOduration=2.5794251299999997 podStartE2EDuration="5.117212263s" podCreationTimestamp="2026-03-14 07:25:12 +0000 UTC" firstStartedPulling="2026-03-14 07:25:14.060428765 +0000 UTC m=+1593.322605557" lastFinishedPulling="2026-03-14 07:25:16.598215898 +0000 UTC m=+1595.860392690" observedRunningTime="2026-03-14 07:25:17.115472401 +0000 UTC m=+1596.377649203" watchObservedRunningTime="2026-03-14 07:25:17.117212263 +0000 UTC m=+1596.379389055" Mar 14 07:25:22 crc kubenswrapper[4893]: I0314 07:25:22.678975 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:22 crc kubenswrapper[4893]: I0314 07:25:22.679804 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:22 crc kubenswrapper[4893]: I0314 07:25:22.746120 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:23 crc kubenswrapper[4893]: I0314 07:25:23.230944 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:23 crc kubenswrapper[4893]: I0314 07:25:23.755785 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.173237 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7wkjd" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="registry-server" 
containerID="cri-o://8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c" gracePeriod=2 Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.617670 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.675031 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities\") pod \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.675167 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content\") pod \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.675248 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfdjx\" (UniqueName: \"kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx\") pod \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\" (UID: \"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109\") " Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.676568 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities" (OuterVolumeSpecName: "utilities") pod "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" (UID: "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.681804 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx" (OuterVolumeSpecName: "kube-api-access-wfdjx") pod "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" (UID: "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109"). InnerVolumeSpecName "kube-api-access-wfdjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.711776 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" (UID: "3ab6f1ea-cecc-4ad8-bb05-9fc300a98109"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.776816 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.776846 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:25 crc kubenswrapper[4893]: I0314 07:25:25.776859 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfdjx\" (UniqueName: \"kubernetes.io/projected/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109-kube-api-access-wfdjx\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.184097 4893 generic.go:334] "Generic (PLEG): container finished" podID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" 
containerID="8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c" exitCode=0 Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.184182 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerDied","Data":"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c"} Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.184234 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wkjd" event={"ID":"3ab6f1ea-cecc-4ad8-bb05-9fc300a98109","Type":"ContainerDied","Data":"9897e2c941788b9cc5b6b557f1b8ab0abf4a7218b883ab637abc6bac1c3e74cc"} Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.184274 4893 scope.go:117] "RemoveContainer" containerID="8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.184561 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wkjd" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.205284 4893 scope.go:117] "RemoveContainer" containerID="ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.226754 4893 scope.go:117] "RemoveContainer" containerID="4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.231343 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.236441 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wkjd"] Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.273249 4893 scope.go:117] "RemoveContainer" containerID="8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c" Mar 14 07:25:26 crc kubenswrapper[4893]: E0314 07:25:26.273681 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c\": container with ID starting with 8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c not found: ID does not exist" containerID="8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.273752 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c"} err="failed to get container status \"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c\": rpc error: code = NotFound desc = could not find container \"8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c\": container with ID starting with 8c07446fe90ae5c66cebd23b1154be70617eb5db8900fedf62384918f4d4b49c not found: 
ID does not exist" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.273780 4893 scope.go:117] "RemoveContainer" containerID="ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc" Mar 14 07:25:26 crc kubenswrapper[4893]: E0314 07:25:26.274235 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc\": container with ID starting with ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc not found: ID does not exist" containerID="ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.274363 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc"} err="failed to get container status \"ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc\": rpc error: code = NotFound desc = could not find container \"ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc\": container with ID starting with ee11db7d932fbfffb823e9c557bd5b86aa603adf418a0bec30eb5a1ce07e5bbc not found: ID does not exist" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.274479 4893 scope.go:117] "RemoveContainer" containerID="4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0" Mar 14 07:25:26 crc kubenswrapper[4893]: E0314 07:25:26.274928 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0\": container with ID starting with 4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0 not found: ID does not exist" containerID="4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0" Mar 14 07:25:26 crc kubenswrapper[4893]: I0314 07:25:26.274974 4893 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0"} err="failed to get container status \"4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0\": rpc error: code = NotFound desc = could not find container \"4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0\": container with ID starting with 4cb3540582d217a10706cae5e1c500de894dcf321ff8541002d6dcb7fcf955b0 not found: ID does not exist" Mar 14 07:25:27 crc kubenswrapper[4893]: I0314 07:25:27.393968 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" path="/var/lib/kubelet/pods/3ab6f1ea-cecc-4ad8-bb05-9fc300a98109/volumes" Mar 14 07:25:29 crc kubenswrapper[4893]: I0314 07:25:29.731357 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:25:29 crc kubenswrapper[4893]: I0314 07:25:29.731461 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:25:59 crc kubenswrapper[4893]: I0314 07:25:59.731877 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:25:59 crc kubenswrapper[4893]: I0314 07:25:59.732479 4893 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:25:59 crc kubenswrapper[4893]: I0314 07:25:59.732548 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:25:59 crc kubenswrapper[4893]: I0314 07:25:59.733186 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:25:59 crc kubenswrapper[4893]: I0314 07:25:59.733242 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" gracePeriod=600 Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.145473 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8hpcz"] Mar 14 07:26:00 crc kubenswrapper[4893]: E0314 07:26:00.146127 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="registry-server" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.146150 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="registry-server" Mar 14 07:26:00 crc kubenswrapper[4893]: E0314 07:26:00.146161 4893 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="extract-content" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.146169 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="extract-content" Mar 14 07:26:00 crc kubenswrapper[4893]: E0314 07:26:00.146182 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="extract-utilities" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.146190 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="extract-utilities" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.146400 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab6f1ea-cecc-4ad8-bb05-9fc300a98109" containerName="registry-server" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.146934 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.150244 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.151167 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.151191 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.154124 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8hpcz"] Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.284843 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mjk\" (UniqueName: 
\"kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk\") pod \"auto-csr-approver-29557886-8hpcz\" (UID: \"d370b3df-c71b-4f67-a02c-d617e3a09ac2\") " pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.386140 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mjk\" (UniqueName: \"kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk\") pod \"auto-csr-approver-29557886-8hpcz\" (UID: \"d370b3df-c71b-4f67-a02c-d617e3a09ac2\") " pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:00 crc kubenswrapper[4893]: E0314 07:26:00.407202 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.408357 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mjk\" (UniqueName: \"kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk\") pod \"auto-csr-approver-29557886-8hpcz\" (UID: \"d370b3df-c71b-4f67-a02c-d617e3a09ac2\") " pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.464226 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.519638 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" exitCode=0 Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.519691 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c"} Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.519730 4893 scope.go:117] "RemoveContainer" containerID="65067d2744ce3683d92ff7c636321367aa0c4ec520d4ca1606a1f744b31b6656" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.520257 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:26:00 crc kubenswrapper[4893]: E0314 07:26:00.520623 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:26:00 crc kubenswrapper[4893]: I0314 07:26:00.912006 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8hpcz"] Mar 14 07:26:00 crc kubenswrapper[4893]: W0314 07:26:00.922765 4893 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd370b3df_c71b_4f67_a02c_d617e3a09ac2.slice/crio-8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58 WatchSource:0}: Error finding container 8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58: Status 404 returned error can't find the container with id 8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58 Mar 14 07:26:01 crc kubenswrapper[4893]: I0314 07:26:01.529986 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" event={"ID":"d370b3df-c71b-4f67-a02c-d617e3a09ac2","Type":"ContainerStarted","Data":"8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58"} Mar 14 07:26:02 crc kubenswrapper[4893]: I0314 07:26:02.537947 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" event={"ID":"d370b3df-c71b-4f67-a02c-d617e3a09ac2","Type":"ContainerStarted","Data":"90c5889dbe26b489675785b24110a2834f4c443d15f6d8544e215fa4aa853b44"} Mar 14 07:26:02 crc kubenswrapper[4893]: I0314 07:26:02.570503 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" podStartSLOduration=1.395924047 podStartE2EDuration="2.570479052s" podCreationTimestamp="2026-03-14 07:26:00 +0000 UTC" firstStartedPulling="2026-03-14 07:26:00.924265787 +0000 UTC m=+1640.186442599" lastFinishedPulling="2026-03-14 07:26:02.098820812 +0000 UTC m=+1641.360997604" observedRunningTime="2026-03-14 07:26:02.567901989 +0000 UTC m=+1641.830078811" watchObservedRunningTime="2026-03-14 07:26:02.570479052 +0000 UTC m=+1641.832655844" Mar 14 07:26:03 crc kubenswrapper[4893]: I0314 07:26:03.550834 4893 generic.go:334] "Generic (PLEG): container finished" podID="d370b3df-c71b-4f67-a02c-d617e3a09ac2" containerID="90c5889dbe26b489675785b24110a2834f4c443d15f6d8544e215fa4aa853b44" exitCode=0 Mar 14 07:26:03 crc kubenswrapper[4893]: 
I0314 07:26:03.550938 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" event={"ID":"d370b3df-c71b-4f67-a02c-d617e3a09ac2","Type":"ContainerDied","Data":"90c5889dbe26b489675785b24110a2834f4c443d15f6d8544e215fa4aa853b44"} Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.441575 4893 scope.go:117] "RemoveContainer" containerID="6a03dce0d2766abd66647169b859b2593ab9d52e6798ec4c10aa43a61bbaab8a" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.468766 4893 scope.go:117] "RemoveContainer" containerID="82ec983d96ddb66f0c0f9d7b584bdc0e99009b85136248cfa0e09a5bbe3c7cc6" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.515047 4893 scope.go:117] "RemoveContainer" containerID="b504f6819a498b47ce1fdbb1b7b652aa93b55e0a07d73848aeab56754f086a86" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.544885 4893 scope.go:117] "RemoveContainer" containerID="2c709f059a4f699836e9f1897e343dfc2e8a083c4e8cbaf73b7b97a62fcff1f6" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.575767 4893 scope.go:117] "RemoveContainer" containerID="415fb706f50c03599c55095f85501084d2b91b5d78ec775f3af93b69ce990538" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.607648 4893 scope.go:117] "RemoveContainer" containerID="53154c3e57be85d18226e2f1de5df6ce18d25ffce08f2f5e8c8de1fcba876ebf" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.637657 4893 scope.go:117] "RemoveContainer" containerID="5ba2a8c54d4952bb7c42fc8298c22338c24c173a22e289545b0cedc062b91036" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.860797 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.965252 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mjk\" (UniqueName: \"kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk\") pod \"d370b3df-c71b-4f67-a02c-d617e3a09ac2\" (UID: \"d370b3df-c71b-4f67-a02c-d617e3a09ac2\") " Mar 14 07:26:04 crc kubenswrapper[4893]: I0314 07:26:04.970239 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk" (OuterVolumeSpecName: "kube-api-access-78mjk") pod "d370b3df-c71b-4f67-a02c-d617e3a09ac2" (UID: "d370b3df-c71b-4f67-a02c-d617e3a09ac2"). InnerVolumeSpecName "kube-api-access-78mjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.067396 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mjk\" (UniqueName: \"kubernetes.io/projected/d370b3df-c71b-4f67-a02c-d617e3a09ac2-kube-api-access-78mjk\") on node \"crc\" DevicePath \"\"" Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.573439 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" event={"ID":"d370b3df-c71b-4f67-a02c-d617e3a09ac2","Type":"ContainerDied","Data":"8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58"} Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.573785 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f17b124ac3d80ef3ee8879665f3f10690e5954b53eff38be8f6530ef520af58" Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.573489 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8hpcz" Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.933407 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-glk9z"] Mar 14 07:26:05 crc kubenswrapper[4893]: I0314 07:26:05.939292 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-glk9z"] Mar 14 07:26:07 crc kubenswrapper[4893]: I0314 07:26:07.387176 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b569a2-4d37-4f06-b9d3-12d05d3a66a9" path="/var/lib/kubelet/pods/a2b569a2-4d37-4f06-b9d3-12d05d3a66a9/volumes" Mar 14 07:26:14 crc kubenswrapper[4893]: I0314 07:26:14.377330 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:26:14 crc kubenswrapper[4893]: E0314 07:26:14.378249 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:26:28 crc kubenswrapper[4893]: I0314 07:26:28.376265 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:26:28 crc kubenswrapper[4893]: E0314 07:26:28.377069 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:26:41 crc kubenswrapper[4893]: I0314 07:26:41.387000 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:26:41 crc kubenswrapper[4893]: E0314 07:26:41.388925 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:26:56 crc kubenswrapper[4893]: I0314 07:26:56.376473 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:26:56 crc kubenswrapper[4893]: E0314 07:26:56.377146 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.841716 4893 scope.go:117] "RemoveContainer" containerID="34a97db1b93c5de877265ef53897d09a872b74dc2ac956116974a8aec710573c" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.886023 4893 scope.go:117] "RemoveContainer" containerID="371538afb0b740f188caaef5fc1d7c03c20a31b7d8b9647370541a43e9085a8a" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.929785 4893 scope.go:117] "RemoveContainer" containerID="f5264743c8414ef7e28aed218c88f09a10839261e5063a2e3bd80f60820a76e0" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.953267 4893 
scope.go:117] "RemoveContainer" containerID="78dc1d4351b3b60a68c762a7af79d2853690910bf049ba88fbb38613b103188f" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.975127 4893 scope.go:117] "RemoveContainer" containerID="b6856d1eab53d5c7defdef917b8853ec5b843c6ef2d675d742473e83cd281411" Mar 14 07:27:04 crc kubenswrapper[4893]: I0314 07:27:04.998187 4893 scope.go:117] "RemoveContainer" containerID="51f5220871dd0aca09702139c996f8aa409270e4f425fc07daec2280fa1f537e" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.024598 4893 scope.go:117] "RemoveContainer" containerID="dfa315a626ac49bd79c0bdbc8fb09eebb1455a5dac7515215a91a87dd968d005" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.057535 4893 scope.go:117] "RemoveContainer" containerID="7789f926434baf6d22b935982b2af6e952f4a729d8d70fb74ec2c13da50d7032" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.080482 4893 scope.go:117] "RemoveContainer" containerID="7ae0c82deca1cab7ae23b6448053023c9a433f9266f77cf656b48631609f0ca2" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.116051 4893 scope.go:117] "RemoveContainer" containerID="418ccc387f45c514ece7bef0efcc443727d9cf334cffaee3ab2b54a98504fe59" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.142663 4893 scope.go:117] "RemoveContainer" containerID="af0bc49ff538ba4ee33598bcf5c437e12efb9cbf76333f072b88487e74333566" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.173744 4893 scope.go:117] "RemoveContainer" containerID="d33597c4daeda4ec0fe028c311645b884a212bf1f0a2101cb0b7a2f6903446eb" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.212092 4893 scope.go:117] "RemoveContainer" containerID="7c0fa865cdda16c5e7099733a49b13b73a32bab1ba4edabe119ddc0ae6452bb4" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.230923 4893 scope.go:117] "RemoveContainer" containerID="f1de5bb67f438364dc9e682d71951ebcde9019cf1468602db93d3be0fb8743d0" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.264771 4893 scope.go:117] 
"RemoveContainer" containerID="c474a796e7a35b3a593b6197965605c48c121d839c4564ec15637bd336060fca" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.297233 4893 scope.go:117] "RemoveContainer" containerID="6d895c5fb663549e3906c267cd862fb00a0c7b62a71de4e4499999dcb434b003" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.343370 4893 scope.go:117] "RemoveContainer" containerID="d721f52196064cd1c51902654dbbd16ab932abd7746c212327ec11b5046ddae0" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.361210 4893 scope.go:117] "RemoveContainer" containerID="a99af99d9bf92b3ba08ac70ddfbb37967dcaee5680fcad631e5bfa22ff771102" Mar 14 07:27:05 crc kubenswrapper[4893]: I0314 07:27:05.407879 4893 scope.go:117] "RemoveContainer" containerID="076d1362e0337a646e1d3cbbd38d27f9651cbd0d40bac4c48674ad108e71c423" Mar 14 07:27:09 crc kubenswrapper[4893]: I0314 07:27:09.377301 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:27:09 crc kubenswrapper[4893]: E0314 07:27:09.378073 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:27:21 crc kubenswrapper[4893]: I0314 07:27:21.381732 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:27:21 crc kubenswrapper[4893]: E0314 07:27:21.382471 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:27:33 crc kubenswrapper[4893]: I0314 07:27:33.377327 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:27:33 crc kubenswrapper[4893]: E0314 07:27:33.378130 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:27:47 crc kubenswrapper[4893]: I0314 07:27:47.377679 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:27:47 crc kubenswrapper[4893]: E0314 07:27:47.378664 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:27:59 crc kubenswrapper[4893]: I0314 07:27:59.377202 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:27:59 crc kubenswrapper[4893]: E0314 07:27:59.377792 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.146116 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557888-54pq6"] Mar 14 07:28:00 crc kubenswrapper[4893]: E0314 07:28:00.146666 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370b3df-c71b-4f67-a02c-d617e3a09ac2" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.146687 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370b3df-c71b-4f67-a02c-d617e3a09ac2" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.146943 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370b3df-c71b-4f67-a02c-d617e3a09ac2" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.147648 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.152541 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.152550 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.152622 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.153186 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-54pq6"] Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.159101 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9tl\" (UniqueName: \"kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl\") pod \"auto-csr-approver-29557888-54pq6\" (UID: \"2fcd5058-c59f-4b66-8b70-64286d6a3b6f\") " pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.260774 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9tl\" (UniqueName: \"kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl\") pod \"auto-csr-approver-29557888-54pq6\" (UID: \"2fcd5058-c59f-4b66-8b70-64286d6a3b6f\") " pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.279857 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9tl\" (UniqueName: \"kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl\") pod \"auto-csr-approver-29557888-54pq6\" (UID: \"2fcd5058-c59f-4b66-8b70-64286d6a3b6f\") " 
pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.469981 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:00 crc kubenswrapper[4893]: I0314 07:28:00.940826 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-54pq6"] Mar 14 07:28:00 crc kubenswrapper[4893]: W0314 07:28:00.941077 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fcd5058_c59f_4b66_8b70_64286d6a3b6f.slice/crio-bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b WatchSource:0}: Error finding container bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b: Status 404 returned error can't find the container with id bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b Mar 14 07:28:01 crc kubenswrapper[4893]: I0314 07:28:01.558390 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-54pq6" event={"ID":"2fcd5058-c59f-4b66-8b70-64286d6a3b6f","Type":"ContainerStarted","Data":"bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b"} Mar 14 07:28:02 crc kubenswrapper[4893]: I0314 07:28:02.570197 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-54pq6" event={"ID":"2fcd5058-c59f-4b66-8b70-64286d6a3b6f","Type":"ContainerStarted","Data":"f7688ceb673a7f0259db2a4f08b4bf4ddff0bd15f556bc381abed4f1066be182"} Mar 14 07:28:02 crc kubenswrapper[4893]: I0314 07:28:02.589502 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557888-54pq6" podStartSLOduration=1.488125438 podStartE2EDuration="2.589486898s" podCreationTimestamp="2026-03-14 07:28:00 +0000 UTC" firstStartedPulling="2026-03-14 07:28:00.944754773 +0000 UTC 
m=+1760.206931565" lastFinishedPulling="2026-03-14 07:28:02.046116203 +0000 UTC m=+1761.308293025" observedRunningTime="2026-03-14 07:28:02.588995016 +0000 UTC m=+1761.851171808" watchObservedRunningTime="2026-03-14 07:28:02.589486898 +0000 UTC m=+1761.851663690" Mar 14 07:28:03 crc kubenswrapper[4893]: I0314 07:28:03.583960 4893 generic.go:334] "Generic (PLEG): container finished" podID="2fcd5058-c59f-4b66-8b70-64286d6a3b6f" containerID="f7688ceb673a7f0259db2a4f08b4bf4ddff0bd15f556bc381abed4f1066be182" exitCode=0 Mar 14 07:28:03 crc kubenswrapper[4893]: I0314 07:28:03.584185 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-54pq6" event={"ID":"2fcd5058-c59f-4b66-8b70-64286d6a3b6f","Type":"ContainerDied","Data":"f7688ceb673a7f0259db2a4f08b4bf4ddff0bd15f556bc381abed4f1066be182"} Mar 14 07:28:04 crc kubenswrapper[4893]: I0314 07:28:04.944053 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.032692 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh9tl\" (UniqueName: \"kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl\") pod \"2fcd5058-c59f-4b66-8b70-64286d6a3b6f\" (UID: \"2fcd5058-c59f-4b66-8b70-64286d6a3b6f\") " Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.042978 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl" (OuterVolumeSpecName: "kube-api-access-sh9tl") pod "2fcd5058-c59f-4b66-8b70-64286d6a3b6f" (UID: "2fcd5058-c59f-4b66-8b70-64286d6a3b6f"). InnerVolumeSpecName "kube-api-access-sh9tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.133733 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh9tl\" (UniqueName: \"kubernetes.io/projected/2fcd5058-c59f-4b66-8b70-64286d6a3b6f-kube-api-access-sh9tl\") on node \"crc\" DevicePath \"\"" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.608397 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-54pq6" event={"ID":"2fcd5058-c59f-4b66-8b70-64286d6a3b6f","Type":"ContainerDied","Data":"bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b"} Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.608456 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc28f9cf060d705bdc4e63a5b50a5d2a6267618ebda98b68636350e4c694146b" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.608496 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-54pq6" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.679453 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-7m6mq"] Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.689072 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-7m6mq"] Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.789330 4893 scope.go:117] "RemoveContainer" containerID="0ff8d81af854d517f8b4724b88564193aeee08ef8708f6f38437ce2e9ccf15a3" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.822225 4893 scope.go:117] "RemoveContainer" containerID="de8f12a68ff328509ec01f4da9cfd1fdd44d102507da5c518743305ac27423ab" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.857144 4893 scope.go:117] "RemoveContainer" containerID="19931e7934a437538c7634c56ea710f8724a76205f9ecedc8b6d7e42a3abc4dd" Mar 14 07:28:05 crc 
kubenswrapper[4893]: I0314 07:28:05.895430 4893 scope.go:117] "RemoveContainer" containerID="78c6cc2dea39de360a990b14c55febaf458a97e1d80a447d463589cdbf99d88b" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.930867 4893 scope.go:117] "RemoveContainer" containerID="3dab1821b74126bc10d3f1995b8bb609c8b02b71125d8eef1e9212847979e89d" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.972379 4893 scope.go:117] "RemoveContainer" containerID="071be3d7b6163f25cab591d62f20975ae6d40b5630f0ede440ea7ddafb12315f" Mar 14 07:28:05 crc kubenswrapper[4893]: I0314 07:28:05.990495 4893 scope.go:117] "RemoveContainer" containerID="bea0181f97653e52ef6b00db91426f325b021b8db102f6f9ca17b54176110d69" Mar 14 07:28:06 crc kubenswrapper[4893]: I0314 07:28:06.006768 4893 scope.go:117] "RemoveContainer" containerID="838c803bb68c10a1197d0b8154267c2281ac4430fa3451398720d14bddbc3c4a" Mar 14 07:28:06 crc kubenswrapper[4893]: I0314 07:28:06.025057 4893 scope.go:117] "RemoveContainer" containerID="4d963d82562ecbbe578925bc8419a6dd95330e0048641392bedfe5325585ff73" Mar 14 07:28:06 crc kubenswrapper[4893]: I0314 07:28:06.057607 4893 scope.go:117] "RemoveContainer" containerID="9ed2f5c118d7fd5926e474d0b57158062f4bbe6f5683be858f729b9d25ff8388" Mar 14 07:28:07 crc kubenswrapper[4893]: I0314 07:28:07.391703 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba46be1-6a5c-4665-aebf-6b243dec4ed7" path="/var/lib/kubelet/pods/eba46be1-6a5c-4665-aebf-6b243dec4ed7/volumes" Mar 14 07:28:10 crc kubenswrapper[4893]: I0314 07:28:10.377477 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:28:10 crc kubenswrapper[4893]: E0314 07:28:10.378321 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:28:22 crc kubenswrapper[4893]: I0314 07:28:22.376759 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:28:22 crc kubenswrapper[4893]: E0314 07:28:22.377493 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:28:33 crc kubenswrapper[4893]: I0314 07:28:33.377327 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:28:33 crc kubenswrapper[4893]: E0314 07:28:33.378310 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:28:47 crc kubenswrapper[4893]: I0314 07:28:47.377843 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:28:47 crc kubenswrapper[4893]: E0314 07:28:47.378681 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:28:58 crc kubenswrapper[4893]: I0314 07:28:58.376546 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:28:58 crc kubenswrapper[4893]: E0314 07:28:58.377293 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.199957 4893 scope.go:117] "RemoveContainer" containerID="d5aab96a43fad3ce075801aad0ba40c5256b90c39be16a195a7a51e462662ae4" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.227083 4893 scope.go:117] "RemoveContainer" containerID="f69d37d7aa3aaad764b626c6801d7a4b236d60fdfae8857b91bae1f1d591f9a8" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.253772 4893 scope.go:117] "RemoveContainer" containerID="1dda442ed9ea7342ce7b0e9960fc52e71ada89b41249ed7867a630b253b5adb5" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.319179 4893 scope.go:117] "RemoveContainer" containerID="950611c56e3344fdc62264065d2587e5a1938230c6418d51d23815de1934eaea" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.344133 4893 scope.go:117] "RemoveContainer" containerID="bf00332d3fc74a103886075ae2cfbfdc2a8883a1876737cc65040af966c58cfc" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.384025 4893 scope.go:117] "RemoveContainer" containerID="a0e606f83c4344007c738a0607d69e9e4174dab58bc088f15e1b8aeacbda77a9" 
Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.421961 4893 scope.go:117] "RemoveContainer" containerID="3b4dcfde76f399ad63952470692ea7797b03fc5391b57f10b894e2301aa465e2" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.437316 4893 scope.go:117] "RemoveContainer" containerID="1d21f42482ee6e162a138e9e63502673b14426d96eadf78b516d61fd03d33c75" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.474105 4893 scope.go:117] "RemoveContainer" containerID="2fc121bd16a2e48ba5ae0e8191ee49932905a50f6c55bad773e4ec5663d05d86" Mar 14 07:29:06 crc kubenswrapper[4893]: I0314 07:29:06.500225 4893 scope.go:117] "RemoveContainer" containerID="7a443e5103fbd27a22cea9d965cf65d3ae0ac1bbfe32f671c5f3f620ad3f8284" Mar 14 07:29:09 crc kubenswrapper[4893]: I0314 07:29:09.376397 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:29:09 crc kubenswrapper[4893]: E0314 07:29:09.376850 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.158732 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:23 crc kubenswrapper[4893]: E0314 07:29:23.160026 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fcd5058-c59f-4b66-8b70-64286d6a3b6f" containerName="oc" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.160048 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcd5058-c59f-4b66-8b70-64286d6a3b6f" containerName="oc" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.160340 4893 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2fcd5058-c59f-4b66-8b70-64286d6a3b6f" containerName="oc" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.162189 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.171720 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.276091 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.276153 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.276411 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5kp9\" (UniqueName: \"kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.379200 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities\") pod 
\"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.379254 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.379786 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.379916 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.380395 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5kp9\" (UniqueName: \"kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9\") pod \"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.404614 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5kp9\" (UniqueName: \"kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9\") pod 
\"community-operators-lxg8n\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.497069 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:23 crc kubenswrapper[4893]: I0314 07:29:23.985992 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:24 crc kubenswrapper[4893]: I0314 07:29:24.329034 4893 generic.go:334] "Generic (PLEG): container finished" podID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerID="31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe" exitCode=0 Mar 14 07:29:24 crc kubenswrapper[4893]: I0314 07:29:24.329650 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerDied","Data":"31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe"} Mar 14 07:29:24 crc kubenswrapper[4893]: I0314 07:29:24.329820 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerStarted","Data":"c8cc1af4c6bd22b4c946b3f481f301f63fa6339c0063ad44b136f63b8d76b766"} Mar 14 07:29:24 crc kubenswrapper[4893]: I0314 07:29:24.331668 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:29:24 crc kubenswrapper[4893]: I0314 07:29:24.377853 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:29:24 crc kubenswrapper[4893]: E0314 07:29:24.378136 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:29:25 crc kubenswrapper[4893]: I0314 07:29:25.341126 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerStarted","Data":"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb"} Mar 14 07:29:26 crc kubenswrapper[4893]: I0314 07:29:26.353251 4893 generic.go:334] "Generic (PLEG): container finished" podID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerID="a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb" exitCode=0 Mar 14 07:29:26 crc kubenswrapper[4893]: I0314 07:29:26.353311 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerDied","Data":"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb"} Mar 14 07:29:27 crc kubenswrapper[4893]: I0314 07:29:27.399515 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerStarted","Data":"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89"} Mar 14 07:29:27 crc kubenswrapper[4893]: I0314 07:29:27.408944 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lxg8n" podStartSLOduration=1.8217416100000001 podStartE2EDuration="4.408922033s" podCreationTimestamp="2026-03-14 07:29:23 +0000 UTC" firstStartedPulling="2026-03-14 07:29:24.3312644 +0000 UTC m=+1843.593441212" lastFinishedPulling="2026-03-14 07:29:26.918444803 +0000 UTC m=+1846.180621635" observedRunningTime="2026-03-14 
07:29:27.402896314 +0000 UTC m=+1846.665073116" watchObservedRunningTime="2026-03-14 07:29:27.408922033 +0000 UTC m=+1846.671098825" Mar 14 07:29:33 crc kubenswrapper[4893]: I0314 07:29:33.498298 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:33 crc kubenswrapper[4893]: I0314 07:29:33.499190 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:33 crc kubenswrapper[4893]: I0314 07:29:33.542654 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:34 crc kubenswrapper[4893]: I0314 07:29:34.497420 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:34 crc kubenswrapper[4893]: I0314 07:29:34.552294 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:36 crc kubenswrapper[4893]: I0314 07:29:36.462631 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lxg8n" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="registry-server" containerID="cri-o://7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89" gracePeriod=2 Mar 14 07:29:36 crc kubenswrapper[4893]: I0314 07:29:36.915690 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.078234 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities\") pod \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.078341 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5kp9\" (UniqueName: \"kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9\") pod \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.078612 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content\") pod \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\" (UID: \"ddce0fd5-7e00-4f05-87ca-bf534185e31a\") " Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.079693 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities" (OuterVolumeSpecName: "utilities") pod "ddce0fd5-7e00-4f05-87ca-bf534185e31a" (UID: "ddce0fd5-7e00-4f05-87ca-bf534185e31a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.087049 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9" (OuterVolumeSpecName: "kube-api-access-l5kp9") pod "ddce0fd5-7e00-4f05-87ca-bf534185e31a" (UID: "ddce0fd5-7e00-4f05-87ca-bf534185e31a"). InnerVolumeSpecName "kube-api-access-l5kp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.172125 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddce0fd5-7e00-4f05-87ca-bf534185e31a" (UID: "ddce0fd5-7e00-4f05-87ca-bf534185e31a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.180348 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.180416 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddce0fd5-7e00-4f05-87ca-bf534185e31a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.180447 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5kp9\" (UniqueName: \"kubernetes.io/projected/ddce0fd5-7e00-4f05-87ca-bf534185e31a-kube-api-access-l5kp9\") on node \"crc\" DevicePath \"\"" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.478606 4893 generic.go:334] "Generic (PLEG): container finished" podID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerID="7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89" exitCode=0 Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.478686 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerDied","Data":"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89"} Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.478726 4893 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lxg8n" event={"ID":"ddce0fd5-7e00-4f05-87ca-bf534185e31a","Type":"ContainerDied","Data":"c8cc1af4c6bd22b4c946b3f481f301f63fa6339c0063ad44b136f63b8d76b766"} Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.478756 4893 scope.go:117] "RemoveContainer" containerID="7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.478965 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxg8n" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.513989 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.536523 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lxg8n"] Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.539462 4893 scope.go:117] "RemoveContainer" containerID="a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.563477 4893 scope.go:117] "RemoveContainer" containerID="31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.606491 4893 scope.go:117] "RemoveContainer" containerID="7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89" Mar 14 07:29:37 crc kubenswrapper[4893]: E0314 07:29:37.607126 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89\": container with ID starting with 7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89 not found: ID does not exist" containerID="7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 
07:29:37.607201 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89"} err="failed to get container status \"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89\": rpc error: code = NotFound desc = could not find container \"7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89\": container with ID starting with 7efa7f506b3fd51a7e4673b75499e331819fc1fad029519a4c339d7fbd671c89 not found: ID does not exist" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.607232 4893 scope.go:117] "RemoveContainer" containerID="a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb" Mar 14 07:29:37 crc kubenswrapper[4893]: E0314 07:29:37.607857 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb\": container with ID starting with a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb not found: ID does not exist" containerID="a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.607926 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb"} err="failed to get container status \"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb\": rpc error: code = NotFound desc = could not find container \"a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb\": container with ID starting with a379419132ec1412e3d9beedda7947f3c0a71de08405bc692659bf401dc720eb not found: ID does not exist" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.607973 4893 scope.go:117] "RemoveContainer" containerID="31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe" Mar 14 07:29:37 crc 
kubenswrapper[4893]: E0314 07:29:37.608456 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe\": container with ID starting with 31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe not found: ID does not exist" containerID="31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe" Mar 14 07:29:37 crc kubenswrapper[4893]: I0314 07:29:37.608495 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe"} err="failed to get container status \"31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe\": rpc error: code = NotFound desc = could not find container \"31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe\": container with ID starting with 31dc48f946dbcd4367514c38baf10a07e0624875b02aec994168dd40658ea3fe not found: ID does not exist" Mar 14 07:29:38 crc kubenswrapper[4893]: I0314 07:29:38.377363 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:29:38 crc kubenswrapper[4893]: E0314 07:29:38.377742 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:29:39 crc kubenswrapper[4893]: I0314 07:29:39.385121 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" path="/var/lib/kubelet/pods/ddce0fd5-7e00-4f05-87ca-bf534185e31a/volumes" Mar 14 07:29:51 crc 
kubenswrapper[4893]: I0314 07:29:51.381367 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:29:51 crc kubenswrapper[4893]: E0314 07:29:51.382819 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.148417 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557890-q7dn8"] Mar 14 07:30:00 crc kubenswrapper[4893]: E0314 07:30:00.150870 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="extract-content" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.152042 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="extract-content" Mar 14 07:30:00 crc kubenswrapper[4893]: E0314 07:30:00.152139 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="registry-server" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.152233 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="registry-server" Mar 14 07:30:00 crc kubenswrapper[4893]: E0314 07:30:00.152361 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="extract-utilities" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.152439 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" 
containerName="extract-utilities" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.152759 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddce0fd5-7e00-4f05-87ca-bf534185e31a" containerName="registry-server" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.153426 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.155868 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.157088 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.157685 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.167865 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5"] Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.169129 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.172475 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.173915 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.177210 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-q7dn8"] Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.187356 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5"] Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.232978 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.233030 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8msn\" (UniqueName: \"kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.233055 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.233102 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95pzz\" (UniqueName: \"kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz\") pod \"auto-csr-approver-29557890-q7dn8\" (UID: \"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9\") " pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.333871 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95pzz\" (UniqueName: \"kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz\") pod \"auto-csr-approver-29557890-q7dn8\" (UID: \"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9\") " pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.333980 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.334000 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8msn\" (UniqueName: \"kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 
07:30:00.334017 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.335345 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.351141 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95pzz\" (UniqueName: \"kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz\") pod \"auto-csr-approver-29557890-q7dn8\" (UID: \"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9\") " pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.351272 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.357197 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8msn\" (UniqueName: \"kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn\") pod \"collect-profiles-29557890-wrxz5\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.477210 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.490876 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:00 crc kubenswrapper[4893]: I0314 07:30:00.955840 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-q7dn8"] Mar 14 07:30:00 crc kubenswrapper[4893]: W0314 07:30:00.966751 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc94b8cc4_381a_4bcb_9fce_6f9d540d33c9.slice/crio-8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91 WatchSource:0}: Error finding container 8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91: Status 404 returned error can't find the container with id 8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91 Mar 14 07:30:01 crc kubenswrapper[4893]: I0314 07:30:01.015559 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5"] Mar 14 07:30:01 crc kubenswrapper[4893]: W0314 07:30:01.018469 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd266d4cc_1e1f_43b7_8d8b_98e032249192.slice/crio-388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151 WatchSource:0}: Error finding container 388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151: Status 404 returned error can't find the container with id 388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151 Mar 14 07:30:01 crc kubenswrapper[4893]: I0314 07:30:01.668092 4893 
generic.go:334] "Generic (PLEG): container finished" podID="d266d4cc-1e1f-43b7-8d8b-98e032249192" containerID="baa987483ca644dd873671273f1b8070a8cc32a295ba7d47cff6ea4d21605b8b" exitCode=0 Mar 14 07:30:01 crc kubenswrapper[4893]: I0314 07:30:01.668140 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" event={"ID":"d266d4cc-1e1f-43b7-8d8b-98e032249192","Type":"ContainerDied","Data":"baa987483ca644dd873671273f1b8070a8cc32a295ba7d47cff6ea4d21605b8b"} Mar 14 07:30:01 crc kubenswrapper[4893]: I0314 07:30:01.668402 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" event={"ID":"d266d4cc-1e1f-43b7-8d8b-98e032249192","Type":"ContainerStarted","Data":"388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151"} Mar 14 07:30:01 crc kubenswrapper[4893]: I0314 07:30:01.670008 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" event={"ID":"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9","Type":"ContainerStarted","Data":"8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91"} Mar 14 07:30:02 crc kubenswrapper[4893]: I0314 07:30:02.680693 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" event={"ID":"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9","Type":"ContainerStarted","Data":"1b410abf40a49057ddfc260eddfb00a317398bcbfb7019f23b4f5b24b557f53d"} Mar 14 07:30:02 crc kubenswrapper[4893]: I0314 07:30:02.696697 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" podStartSLOduration=1.470398081 podStartE2EDuration="2.696674762s" podCreationTimestamp="2026-03-14 07:30:00 +0000 UTC" firstStartedPulling="2026-03-14 07:30:00.968605902 +0000 UTC m=+1880.230782694" lastFinishedPulling="2026-03-14 07:30:02.194882583 +0000 UTC m=+1881.457059375" 
observedRunningTime="2026-03-14 07:30:02.696101008 +0000 UTC m=+1881.958277850" watchObservedRunningTime="2026-03-14 07:30:02.696674762 +0000 UTC m=+1881.958851564" Mar 14 07:30:02 crc kubenswrapper[4893]: I0314 07:30:02.977554 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.077623 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume\") pod \"d266d4cc-1e1f-43b7-8d8b-98e032249192\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.077687 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume\") pod \"d266d4cc-1e1f-43b7-8d8b-98e032249192\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.077728 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8msn\" (UniqueName: \"kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn\") pod \"d266d4cc-1e1f-43b7-8d8b-98e032249192\" (UID: \"d266d4cc-1e1f-43b7-8d8b-98e032249192\") " Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.078261 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume" (OuterVolumeSpecName: "config-volume") pod "d266d4cc-1e1f-43b7-8d8b-98e032249192" (UID: "d266d4cc-1e1f-43b7-8d8b-98e032249192"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.082866 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d266d4cc-1e1f-43b7-8d8b-98e032249192" (UID: "d266d4cc-1e1f-43b7-8d8b-98e032249192"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.082885 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn" (OuterVolumeSpecName: "kube-api-access-s8msn") pod "d266d4cc-1e1f-43b7-8d8b-98e032249192" (UID: "d266d4cc-1e1f-43b7-8d8b-98e032249192"). InnerVolumeSpecName "kube-api-access-s8msn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.179417 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8msn\" (UniqueName: \"kubernetes.io/projected/d266d4cc-1e1f-43b7-8d8b-98e032249192-kube-api-access-s8msn\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.179451 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d266d4cc-1e1f-43b7-8d8b-98e032249192-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.179462 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d266d4cc-1e1f-43b7-8d8b-98e032249192-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.689818 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" 
event={"ID":"d266d4cc-1e1f-43b7-8d8b-98e032249192","Type":"ContainerDied","Data":"388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151"} Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.690132 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388b2bc90a0e045e279c277a12abb2f9e65722f35a1266b5befde91ccb595151" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.689835 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5" Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.691268 4893 generic.go:334] "Generic (PLEG): container finished" podID="c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" containerID="1b410abf40a49057ddfc260eddfb00a317398bcbfb7019f23b4f5b24b557f53d" exitCode=0 Mar 14 07:30:03 crc kubenswrapper[4893]: I0314 07:30:03.691345 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" event={"ID":"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9","Type":"ContainerDied","Data":"1b410abf40a49057ddfc260eddfb00a317398bcbfb7019f23b4f5b24b557f53d"} Mar 14 07:30:04 crc kubenswrapper[4893]: I0314 07:30:04.965649 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.004940 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95pzz\" (UniqueName: \"kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz\") pod \"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9\" (UID: \"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9\") " Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.011259 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz" (OuterVolumeSpecName: "kube-api-access-95pzz") pod "c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" (UID: "c94b8cc4-381a-4bcb-9fce-6f9d540d33c9"). InnerVolumeSpecName "kube-api-access-95pzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.109510 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95pzz\" (UniqueName: \"kubernetes.io/projected/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9-kube-api-access-95pzz\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.706433 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" event={"ID":"c94b8cc4-381a-4bcb-9fce-6f9d540d33c9","Type":"ContainerDied","Data":"8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91"} Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.706471 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-q7dn8" Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.706488 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b17d67be6048ab7c08902d52f9c0e5edc53b15343f2d4e2d99dc4f69f836f91" Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.758810 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-7b2sz"] Mar 14 07:30:05 crc kubenswrapper[4893]: I0314 07:30:05.766886 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-7b2sz"] Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.377131 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:30:06 crc kubenswrapper[4893]: E0314 07:30:06.377715 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.590327 4893 scope.go:117] "RemoveContainer" containerID="212a975ff9e1cacce9d3e6283aa65866d8f8e34990736ef7141b9b6c2cfaca29" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.605799 4893 scope.go:117] "RemoveContainer" containerID="073b9c184ce9046866e4943bc04f1b040fc75258df01b2366699930f65b01981" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.625350 4893 scope.go:117] "RemoveContainer" containerID="90e9322ef26dbbffc9cc7bc8b8905ff428b7b56d164c235cfcf177745f83890c" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.661922 4893 scope.go:117] "RemoveContainer" 
containerID="7facda4f79e49359930910e74711a0b4e237b3502da367234832f322f7e87812" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.679786 4893 scope.go:117] "RemoveContainer" containerID="1479c505fa63564b68dc67398eb06f224838c90b598b1f33bb9bfdfc5ff3333d" Mar 14 07:30:06 crc kubenswrapper[4893]: I0314 07:30:06.696230 4893 scope.go:117] "RemoveContainer" containerID="f93c2e6852c1f30ddcb040cabab75f3dbad059392845c126c070852b93a48323" Mar 14 07:30:07 crc kubenswrapper[4893]: I0314 07:30:07.384832 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f56851e-f6ad-46d1-9b96-8c6a9d80a227" path="/var/lib/kubelet/pods/4f56851e-f6ad-46d1-9b96-8c6a9d80a227/volumes" Mar 14 07:30:18 crc kubenswrapper[4893]: I0314 07:30:18.377057 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:30:18 crc kubenswrapper[4893]: E0314 07:30:18.378015 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:30:30 crc kubenswrapper[4893]: I0314 07:30:30.376764 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:30:30 crc kubenswrapper[4893]: E0314 07:30:30.377573 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:30:43 crc kubenswrapper[4893]: I0314 07:30:43.377251 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:30:43 crc kubenswrapper[4893]: E0314 07:30:43.378811 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:30:55 crc kubenswrapper[4893]: I0314 07:30:55.376851 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:30:55 crc kubenswrapper[4893]: E0314 07:30:55.377801 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:31:06 crc kubenswrapper[4893]: I0314 07:31:06.781591 4893 scope.go:117] "RemoveContainer" containerID="0f1669c9351590b8d4ea31c6f74fb1dfaf88c99551b1ad6898325ff1daada067" Mar 14 07:31:09 crc kubenswrapper[4893]: I0314 07:31:09.376709 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:31:10 crc kubenswrapper[4893]: I0314 07:31:10.211554 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8"} Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.145259 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557892-2269r"] Mar 14 07:32:00 crc kubenswrapper[4893]: E0314 07:32:00.146369 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.146383 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[4893]: E0314 07:32:00.146409 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d266d4cc-1e1f-43b7-8d8b-98e032249192" containerName="collect-profiles" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.146415 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d266d4cc-1e1f-43b7-8d8b-98e032249192" containerName="collect-profiles" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.146569 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d266d4cc-1e1f-43b7-8d8b-98e032249192" containerName="collect-profiles" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.146590 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.147000 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.149435 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.151099 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.151226 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.160079 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-2269r"] Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.200165 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8xlj\" (UniqueName: \"kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj\") pod \"auto-csr-approver-29557892-2269r\" (UID: \"8697a2cb-05a0-4af0-8579-9a6629290862\") " pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.300942 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8xlj\" (UniqueName: \"kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj\") pod \"auto-csr-approver-29557892-2269r\" (UID: \"8697a2cb-05a0-4af0-8579-9a6629290862\") " pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.324916 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8xlj\" (UniqueName: \"kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj\") pod \"auto-csr-approver-29557892-2269r\" (UID: \"8697a2cb-05a0-4af0-8579-9a6629290862\") " 
pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.468107 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:00 crc kubenswrapper[4893]: I0314 07:32:00.690772 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-2269r"] Mar 14 07:32:01 crc kubenswrapper[4893]: I0314 07:32:01.614589 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-2269r" event={"ID":"8697a2cb-05a0-4af0-8579-9a6629290862","Type":"ContainerStarted","Data":"4c75be487bdb5733bed019c4ccd6b7290c3247ca3f91e6d79b964dd0c3174855"} Mar 14 07:32:02 crc kubenswrapper[4893]: I0314 07:32:02.624112 4893 generic.go:334] "Generic (PLEG): container finished" podID="8697a2cb-05a0-4af0-8579-9a6629290862" containerID="179eac334379ef478f65f3ba4f3324286015925fb2d4e09fa8fb73affc60f8a9" exitCode=0 Mar 14 07:32:02 crc kubenswrapper[4893]: I0314 07:32:02.624178 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-2269r" event={"ID":"8697a2cb-05a0-4af0-8579-9a6629290862","Type":"ContainerDied","Data":"179eac334379ef478f65f3ba4f3324286015925fb2d4e09fa8fb73affc60f8a9"} Mar 14 07:32:03 crc kubenswrapper[4893]: I0314 07:32:03.915986 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:03 crc kubenswrapper[4893]: I0314 07:32:03.963469 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8xlj\" (UniqueName: \"kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj\") pod \"8697a2cb-05a0-4af0-8579-9a6629290862\" (UID: \"8697a2cb-05a0-4af0-8579-9a6629290862\") " Mar 14 07:32:03 crc kubenswrapper[4893]: I0314 07:32:03.974072 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj" (OuterVolumeSpecName: "kube-api-access-m8xlj") pod "8697a2cb-05a0-4af0-8579-9a6629290862" (UID: "8697a2cb-05a0-4af0-8579-9a6629290862"). InnerVolumeSpecName "kube-api-access-m8xlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.065349 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8xlj\" (UniqueName: \"kubernetes.io/projected/8697a2cb-05a0-4af0-8579-9a6629290862-kube-api-access-m8xlj\") on node \"crc\" DevicePath \"\"" Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.646806 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-2269r" event={"ID":"8697a2cb-05a0-4af0-8579-9a6629290862","Type":"ContainerDied","Data":"4c75be487bdb5733bed019c4ccd6b7290c3247ca3f91e6d79b964dd0c3174855"} Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.646871 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c75be487bdb5733bed019c4ccd6b7290c3247ca3f91e6d79b964dd0c3174855" Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.646922 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-2269r" Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.986688 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8hpcz"] Mar 14 07:32:04 crc kubenswrapper[4893]: I0314 07:32:04.990074 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8hpcz"] Mar 14 07:32:05 crc kubenswrapper[4893]: I0314 07:32:05.384662 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d370b3df-c71b-4f67-a02c-d617e3a09ac2" path="/var/lib/kubelet/pods/d370b3df-c71b-4f67-a02c-d617e3a09ac2/volumes" Mar 14 07:32:06 crc kubenswrapper[4893]: I0314 07:32:06.870921 4893 scope.go:117] "RemoveContainer" containerID="90c5889dbe26b489675785b24110a2834f4c443d15f6d8544e215fa4aa853b44" Mar 14 07:33:29 crc kubenswrapper[4893]: I0314 07:33:29.731637 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:33:29 crc kubenswrapper[4893]: I0314 07:33:29.732313 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:33:59 crc kubenswrapper[4893]: I0314 07:33:59.731270 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:33:59 crc kubenswrapper[4893]: 
I0314 07:33:59.732008 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.162672 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557894-xhdbs"] Mar 14 07:34:00 crc kubenswrapper[4893]: E0314 07:34:00.163983 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8697a2cb-05a0-4af0-8579-9a6629290862" containerName="oc" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.164152 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8697a2cb-05a0-4af0-8579-9a6629290862" containerName="oc" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.164580 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="8697a2cb-05a0-4af0-8579-9a6629290862" containerName="oc" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.165816 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.169008 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.169686 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.174263 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.178647 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-xhdbs"] Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.311551 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzx65\" (UniqueName: \"kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65\") pod \"auto-csr-approver-29557894-xhdbs\" (UID: \"761ca03d-ec29-4f34-a325-3396ffbf620e\") " pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.413429 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzx65\" (UniqueName: \"kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65\") pod \"auto-csr-approver-29557894-xhdbs\" (UID: \"761ca03d-ec29-4f34-a325-3396ffbf620e\") " pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.440029 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzx65\" (UniqueName: \"kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65\") pod \"auto-csr-approver-29557894-xhdbs\" (UID: \"761ca03d-ec29-4f34-a325-3396ffbf620e\") " 
pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.509596 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:00 crc kubenswrapper[4893]: I0314 07:34:00.985054 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-xhdbs"] Mar 14 07:34:01 crc kubenswrapper[4893]: I0314 07:34:01.666275 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" event={"ID":"761ca03d-ec29-4f34-a325-3396ffbf620e","Type":"ContainerStarted","Data":"02b36673d6a60517de82414ee9d0fa3c548550a4c03c728502a6ef731fd64b2a"} Mar 14 07:34:02 crc kubenswrapper[4893]: I0314 07:34:02.675055 4893 generic.go:334] "Generic (PLEG): container finished" podID="761ca03d-ec29-4f34-a325-3396ffbf620e" containerID="707959329ad44149cc2390288f5cbbc19baee193dcc5acb11ddf6fd64aff9359" exitCode=0 Mar 14 07:34:02 crc kubenswrapper[4893]: I0314 07:34:02.675118 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" event={"ID":"761ca03d-ec29-4f34-a325-3396ffbf620e","Type":"ContainerDied","Data":"707959329ad44149cc2390288f5cbbc19baee193dcc5acb11ddf6fd64aff9359"} Mar 14 07:34:03 crc kubenswrapper[4893]: I0314 07:34:03.987145 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.070730 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzx65\" (UniqueName: \"kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65\") pod \"761ca03d-ec29-4f34-a325-3396ffbf620e\" (UID: \"761ca03d-ec29-4f34-a325-3396ffbf620e\") " Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.081940 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65" (OuterVolumeSpecName: "kube-api-access-bzx65") pod "761ca03d-ec29-4f34-a325-3396ffbf620e" (UID: "761ca03d-ec29-4f34-a325-3396ffbf620e"). InnerVolumeSpecName "kube-api-access-bzx65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.172224 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzx65\" (UniqueName: \"kubernetes.io/projected/761ca03d-ec29-4f34-a325-3396ffbf620e-kube-api-access-bzx65\") on node \"crc\" DevicePath \"\"" Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.691241 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" event={"ID":"761ca03d-ec29-4f34-a325-3396ffbf620e","Type":"ContainerDied","Data":"02b36673d6a60517de82414ee9d0fa3c548550a4c03c728502a6ef731fd64b2a"} Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.691309 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02b36673d6a60517de82414ee9d0fa3c548550a4c03c728502a6ef731fd64b2a" Mar 14 07:34:04 crc kubenswrapper[4893]: I0314 07:34:04.691355 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-xhdbs" Mar 14 07:34:05 crc kubenswrapper[4893]: I0314 07:34:05.071035 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-54pq6"] Mar 14 07:34:05 crc kubenswrapper[4893]: I0314 07:34:05.079772 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-54pq6"] Mar 14 07:34:05 crc kubenswrapper[4893]: I0314 07:34:05.387825 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fcd5058-c59f-4b66-8b70-64286d6a3b6f" path="/var/lib/kubelet/pods/2fcd5058-c59f-4b66-8b70-64286d6a3b6f/volumes" Mar 14 07:34:06 crc kubenswrapper[4893]: I0314 07:34:06.960289 4893 scope.go:117] "RemoveContainer" containerID="f7688ceb673a7f0259db2a4f08b4bf4ddff0bd15f556bc381abed4f1066be182" Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.731647 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.732230 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.732301 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.733010 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.733079 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8" gracePeriod=600 Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.881007 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8" exitCode=0 Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.881055 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8"} Mar 14 07:34:29 crc kubenswrapper[4893]: I0314 07:34:29.881087 4893 scope.go:117] "RemoveContainer" containerID="95504669fee57337f728128f8e2d8bc7bfed00823cb8a0e77ef2c491f6b7fa2c" Mar 14 07:34:30 crc kubenswrapper[4893]: I0314 07:34:30.887236 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab"} Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.149305 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557896-zhqmr"] Mar 14 07:36:00 crc kubenswrapper[4893]: E0314 
07:36:00.150286 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761ca03d-ec29-4f34-a325-3396ffbf620e" containerName="oc" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.150307 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="761ca03d-ec29-4f34-a325-3396ffbf620e" containerName="oc" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.150580 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="761ca03d-ec29-4f34-a325-3396ffbf620e" containerName="oc" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.151242 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.154155 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.154380 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.154733 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.179621 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-zhqmr"] Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.246023 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.247647 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.260891 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.304254 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgx4\" (UniqueName: \"kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4\") pod \"auto-csr-approver-29557896-zhqmr\" (UID: \"3b7d9832-78e5-4045-a6fa-64faf415f86b\") " pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.405925 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm5qm\" (UniqueName: \"kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.406032 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgx4\" (UniqueName: \"kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4\") pod \"auto-csr-approver-29557896-zhqmr\" (UID: \"3b7d9832-78e5-4045-a6fa-64faf415f86b\") " pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.406077 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.406100 
4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.429535 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgx4\" (UniqueName: \"kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4\") pod \"auto-csr-approver-29557896-zhqmr\" (UID: \"3b7d9832-78e5-4045-a6fa-64faf415f86b\") " pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.467533 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.506872 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm5qm\" (UniqueName: \"kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.506982 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.507005 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.507397 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.508276 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.533634 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm5qm\" (UniqueName: \"kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm\") pod \"certified-operators-w8qj6\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.562157 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.876352 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:00 crc kubenswrapper[4893]: I0314 07:36:00.999221 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:36:01 crc kubenswrapper[4893]: I0314 07:36:01.001025 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-zhqmr"] Mar 14 07:36:01 crc kubenswrapper[4893]: I0314 07:36:01.631676 4893 generic.go:334] "Generic (PLEG): container finished" podID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerID="fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb" exitCode=0 Mar 14 07:36:01 crc kubenswrapper[4893]: I0314 07:36:01.631757 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerDied","Data":"fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb"} Mar 14 07:36:01 crc kubenswrapper[4893]: I0314 07:36:01.632155 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerStarted","Data":"b5707aaec15ef3ad9a7528313f4f4a8779b344844a6a562ac2c52be61b7c1d4a"} Mar 14 07:36:01 crc kubenswrapper[4893]: I0314 07:36:01.634867 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" event={"ID":"3b7d9832-78e5-4045-a6fa-64faf415f86b","Type":"ContainerStarted","Data":"609d9e76f39dd7af211655667b3fb1ec575c5a3b7e4642c3d4222d90cc975d44"} Mar 14 07:36:02 crc kubenswrapper[4893]: I0314 07:36:02.642316 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" 
event={"ID":"3b7d9832-78e5-4045-a6fa-64faf415f86b","Type":"ContainerStarted","Data":"5312e1b3af7b2942dfd72589b2474274aa88700f3430699d9fd970bbc29c941b"} Mar 14 07:36:02 crc kubenswrapper[4893]: I0314 07:36:02.643948 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerStarted","Data":"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3"} Mar 14 07:36:02 crc kubenswrapper[4893]: I0314 07:36:02.666964 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" podStartSLOduration=1.399430939 podStartE2EDuration="2.666939224s" podCreationTimestamp="2026-03-14 07:36:00 +0000 UTC" firstStartedPulling="2026-03-14 07:36:00.998895912 +0000 UTC m=+2240.261072704" lastFinishedPulling="2026-03-14 07:36:02.266404197 +0000 UTC m=+2241.528580989" observedRunningTime="2026-03-14 07:36:02.652083498 +0000 UTC m=+2241.914260290" watchObservedRunningTime="2026-03-14 07:36:02.666939224 +0000 UTC m=+2241.929116016" Mar 14 07:36:03 crc kubenswrapper[4893]: I0314 07:36:03.659813 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerDied","Data":"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3"} Mar 14 07:36:03 crc kubenswrapper[4893]: I0314 07:36:03.659665 4893 generic.go:334] "Generic (PLEG): container finished" podID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerID="c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3" exitCode=0 Mar 14 07:36:03 crc kubenswrapper[4893]: I0314 07:36:03.667351 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" 
event={"ID":"3b7d9832-78e5-4045-a6fa-64faf415f86b","Type":"ContainerDied","Data":"5312e1b3af7b2942dfd72589b2474274aa88700f3430699d9fd970bbc29c941b"} Mar 14 07:36:03 crc kubenswrapper[4893]: I0314 07:36:03.667279 4893 generic.go:334] "Generic (PLEG): container finished" podID="3b7d9832-78e5-4045-a6fa-64faf415f86b" containerID="5312e1b3af7b2942dfd72589b2474274aa88700f3430699d9fd970bbc29c941b" exitCode=0 Mar 14 07:36:04 crc kubenswrapper[4893]: I0314 07:36:04.677049 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerStarted","Data":"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd"} Mar 14 07:36:04 crc kubenswrapper[4893]: I0314 07:36:04.705677 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w8qj6" podStartSLOduration=2.254815631 podStartE2EDuration="4.705656807s" podCreationTimestamp="2026-03-14 07:36:00 +0000 UTC" firstStartedPulling="2026-03-14 07:36:01.634300525 +0000 UTC m=+2240.896477357" lastFinishedPulling="2026-03-14 07:36:04.085141731 +0000 UTC m=+2243.347318533" observedRunningTime="2026-03-14 07:36:04.702246384 +0000 UTC m=+2243.964423226" watchObservedRunningTime="2026-03-14 07:36:04.705656807 +0000 UTC m=+2243.967833609" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.051602 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.172303 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncgx4\" (UniqueName: \"kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4\") pod \"3b7d9832-78e5-4045-a6fa-64faf415f86b\" (UID: \"3b7d9832-78e5-4045-a6fa-64faf415f86b\") " Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.191660 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4" (OuterVolumeSpecName: "kube-api-access-ncgx4") pod "3b7d9832-78e5-4045-a6fa-64faf415f86b" (UID: "3b7d9832-78e5-4045-a6fa-64faf415f86b"). InnerVolumeSpecName "kube-api-access-ncgx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.273946 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncgx4\" (UniqueName: \"kubernetes.io/projected/3b7d9832-78e5-4045-a6fa-64faf415f86b-kube-api-access-ncgx4\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.707024 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.707008 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-zhqmr" event={"ID":"3b7d9832-78e5-4045-a6fa-64faf415f86b","Type":"ContainerDied","Data":"609d9e76f39dd7af211655667b3fb1ec575c5a3b7e4642c3d4222d90cc975d44"} Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.707565 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="609d9e76f39dd7af211655667b3fb1ec575c5a3b7e4642c3d4222d90cc975d44" Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.734669 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-q7dn8"] Mar 14 07:36:05 crc kubenswrapper[4893]: I0314 07:36:05.743211 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-q7dn8"] Mar 14 07:36:07 crc kubenswrapper[4893]: I0314 07:36:07.396208 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94b8cc4-381a-4bcb-9fce-6f9d540d33c9" path="/var/lib/kubelet/pods/c94b8cc4-381a-4bcb-9fce-6f9d540d33c9/volumes" Mar 14 07:36:10 crc kubenswrapper[4893]: I0314 07:36:10.563389 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:10 crc kubenswrapper[4893]: I0314 07:36:10.563700 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:10 crc kubenswrapper[4893]: I0314 07:36:10.617740 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:10 crc kubenswrapper[4893]: I0314 07:36:10.793330 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:10 crc 
kubenswrapper[4893]: I0314 07:36:10.857632 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:12 crc kubenswrapper[4893]: I0314 07:36:12.768022 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w8qj6" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="registry-server" containerID="cri-o://c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd" gracePeriod=2 Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.150391 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.301291 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content\") pod \"55eb989b-6741-404b-86a2-f63a78e2d24d\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.301434 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm5qm\" (UniqueName: \"kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm\") pod \"55eb989b-6741-404b-86a2-f63a78e2d24d\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.301455 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities\") pod \"55eb989b-6741-404b-86a2-f63a78e2d24d\" (UID: \"55eb989b-6741-404b-86a2-f63a78e2d24d\") " Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.303170 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities" (OuterVolumeSpecName: "utilities") pod "55eb989b-6741-404b-86a2-f63a78e2d24d" (UID: "55eb989b-6741-404b-86a2-f63a78e2d24d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.307645 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm" (OuterVolumeSpecName: "kube-api-access-bm5qm") pod "55eb989b-6741-404b-86a2-f63a78e2d24d" (UID: "55eb989b-6741-404b-86a2-f63a78e2d24d"). InnerVolumeSpecName "kube-api-access-bm5qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.351115 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55eb989b-6741-404b-86a2-f63a78e2d24d" (UID: "55eb989b-6741-404b-86a2-f63a78e2d24d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.402495 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.402536 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm5qm\" (UniqueName: \"kubernetes.io/projected/55eb989b-6741-404b-86a2-f63a78e2d24d-kube-api-access-bm5qm\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.402547 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eb989b-6741-404b-86a2-f63a78e2d24d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.778967 4893 generic.go:334] "Generic (PLEG): container finished" podID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerID="c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd" exitCode=0 Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.779020 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerDied","Data":"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd"} Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.779050 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w8qj6" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.779070 4893 scope.go:117] "RemoveContainer" containerID="c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.779056 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8qj6" event={"ID":"55eb989b-6741-404b-86a2-f63a78e2d24d","Type":"ContainerDied","Data":"b5707aaec15ef3ad9a7528313f4f4a8779b344844a6a562ac2c52be61b7c1d4a"} Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.813012 4893 scope.go:117] "RemoveContainer" containerID="c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.817826 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.827610 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w8qj6"] Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.832734 4893 scope.go:117] "RemoveContainer" containerID="fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.853131 4893 scope.go:117] "RemoveContainer" containerID="c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd" Mar 14 07:36:13 crc kubenswrapper[4893]: E0314 07:36:13.853582 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd\": container with ID starting with c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd not found: ID does not exist" containerID="c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.853624 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd"} err="failed to get container status \"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd\": rpc error: code = NotFound desc = could not find container \"c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd\": container with ID starting with c72a7ebcecda402b04a6d2e5d5b8dfd1c2b013164fb56a242db0398cb07a92fd not found: ID does not exist" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.853650 4893 scope.go:117] "RemoveContainer" containerID="c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3" Mar 14 07:36:13 crc kubenswrapper[4893]: E0314 07:36:13.854043 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3\": container with ID starting with c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3 not found: ID does not exist" containerID="c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.854769 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3"} err="failed to get container status \"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3\": rpc error: code = NotFound desc = could not find container \"c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3\": container with ID starting with c7f5849788216a51cd2bfb5c69a14a24772e20961d2972cdfb2e21a6a79e14f3 not found: ID does not exist" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.854793 4893 scope.go:117] "RemoveContainer" containerID="fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb" Mar 14 07:36:13 crc kubenswrapper[4893]: E0314 
07:36:13.855409 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb\": container with ID starting with fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb not found: ID does not exist" containerID="fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb" Mar 14 07:36:13 crc kubenswrapper[4893]: I0314 07:36:13.855463 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb"} err="failed to get container status \"fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb\": rpc error: code = NotFound desc = could not find container \"fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb\": container with ID starting with fb616671eaf9b74b7650b720301b0bb29c563a5c581beb7bf391dbfa4bd2bdeb not found: ID does not exist" Mar 14 07:36:15 crc kubenswrapper[4893]: I0314 07:36:15.386861 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" path="/var/lib/kubelet/pods/55eb989b-6741-404b-86a2-f63a78e2d24d/volumes" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.418784 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:24 crc kubenswrapper[4893]: E0314 07:36:24.419606 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7d9832-78e5-4045-a6fa-64faf415f86b" containerName="oc" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419619 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7d9832-78e5-4045-a6fa-64faf415f86b" containerName="oc" Mar 14 07:36:24 crc kubenswrapper[4893]: E0314 07:36:24.419640 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" 
containerName="registry-server" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419652 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="registry-server" Mar 14 07:36:24 crc kubenswrapper[4893]: E0314 07:36:24.419665 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="extract-utilities" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419671 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="extract-utilities" Mar 14 07:36:24 crc kubenswrapper[4893]: E0314 07:36:24.419684 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="extract-content" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419690 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="extract-content" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419827 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7d9832-78e5-4045-a6fa-64faf415f86b" containerName="oc" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.419848 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="55eb989b-6741-404b-86a2-f63a78e2d24d" containerName="registry-server" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.420876 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.435804 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.569722 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.569814 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7czcc\" (UniqueName: \"kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.569857 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.619648 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.621408 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.637621 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.670701 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7czcc\" (UniqueName: \"kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.670750 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.670816 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.671287 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.671485 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.693485 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7czcc\" (UniqueName: \"kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc\") pod \"redhat-marketplace-9h668\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.750585 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.771826 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.771931 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqb77\" (UniqueName: \"kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.771991 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " 
pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.873861 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.873932 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.873981 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqb77\" (UniqueName: \"kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.874860 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.875066 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc 
kubenswrapper[4893]: I0314 07:36:24.895395 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqb77\" (UniqueName: \"kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77\") pod \"redhat-operators-rzk42\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:24 crc kubenswrapper[4893]: I0314 07:36:24.939139 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.169261 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.227989 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:25 crc kubenswrapper[4893]: W0314 07:36:25.242281 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3175605d_4cd2_496a_8792_2dd587b3dd9a.slice/crio-4295042de41eb7c7418f947483892505f4f5811fb7f3bf0a79a6fc04e322080a WatchSource:0}: Error finding container 4295042de41eb7c7418f947483892505f4f5811fb7f3bf0a79a6fc04e322080a: Status 404 returned error can't find the container with id 4295042de41eb7c7418f947483892505f4f5811fb7f3bf0a79a6fc04e322080a Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.872698 4893 generic.go:334] "Generic (PLEG): container finished" podID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerID="0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a" exitCode=0 Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.872758 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" 
event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerDied","Data":"0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a"} Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.873028 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerStarted","Data":"4295042de41eb7c7418f947483892505f4f5811fb7f3bf0a79a6fc04e322080a"} Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.874597 4893 generic.go:334] "Generic (PLEG): container finished" podID="b8f77a6f-7369-4936-9631-12789f4c691d" containerID="cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3" exitCode=0 Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.874634 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerDied","Data":"cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3"} Mar 14 07:36:25 crc kubenswrapper[4893]: I0314 07:36:25.874676 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerStarted","Data":"bd78f2db3ffa6c0084042b7a0990bef4c9067baf916fad70615e8561dbbb5f63"} Mar 14 07:36:26 crc kubenswrapper[4893]: I0314 07:36:26.882694 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerStarted","Data":"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990"} Mar 14 07:36:26 crc kubenswrapper[4893]: I0314 07:36:26.885468 4893 generic.go:334] "Generic (PLEG): container finished" podID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerID="f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874" exitCode=0 Mar 14 07:36:26 crc kubenswrapper[4893]: I0314 
07:36:26.885503 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerDied","Data":"f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874"} Mar 14 07:36:27 crc kubenswrapper[4893]: I0314 07:36:27.893088 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerStarted","Data":"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4"} Mar 14 07:36:27 crc kubenswrapper[4893]: I0314 07:36:27.894727 4893 generic.go:334] "Generic (PLEG): container finished" podID="b8f77a6f-7369-4936-9631-12789f4c691d" containerID="169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990" exitCode=0 Mar 14 07:36:27 crc kubenswrapper[4893]: I0314 07:36:27.894782 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerDied","Data":"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990"} Mar 14 07:36:27 crc kubenswrapper[4893]: I0314 07:36:27.923355 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9h668" podStartSLOduration=2.504281926 podStartE2EDuration="3.923322124s" podCreationTimestamp="2026-03-14 07:36:24 +0000 UTC" firstStartedPulling="2026-03-14 07:36:25.874085311 +0000 UTC m=+2265.136262103" lastFinishedPulling="2026-03-14 07:36:27.293125509 +0000 UTC m=+2266.555302301" observedRunningTime="2026-03-14 07:36:27.914041376 +0000 UTC m=+2267.176218178" watchObservedRunningTime="2026-03-14 07:36:27.923322124 +0000 UTC m=+2267.185498956" Mar 14 07:36:28 crc kubenswrapper[4893]: I0314 07:36:28.907779 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" 
event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerStarted","Data":"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3"} Mar 14 07:36:28 crc kubenswrapper[4893]: I0314 07:36:28.935107 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzk42" podStartSLOduration=2.301146522 podStartE2EDuration="4.935080419s" podCreationTimestamp="2026-03-14 07:36:24 +0000 UTC" firstStartedPulling="2026-03-14 07:36:25.877341451 +0000 UTC m=+2265.139518243" lastFinishedPulling="2026-03-14 07:36:28.511275348 +0000 UTC m=+2267.773452140" observedRunningTime="2026-03-14 07:36:28.932368702 +0000 UTC m=+2268.194545514" watchObservedRunningTime="2026-03-14 07:36:28.935080419 +0000 UTC m=+2268.197257221" Mar 14 07:36:34 crc kubenswrapper[4893]: I0314 07:36:34.751336 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:34 crc kubenswrapper[4893]: I0314 07:36:34.751851 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:34 crc kubenswrapper[4893]: I0314 07:36:34.807341 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:34 crc kubenswrapper[4893]: I0314 07:36:34.941136 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:34 crc kubenswrapper[4893]: I0314 07:36:34.941326 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:35 crc kubenswrapper[4893]: I0314 07:36:35.015697 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:35 crc kubenswrapper[4893]: I0314 07:36:35.069798 4893 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:35 crc kubenswrapper[4893]: I0314 07:36:35.990135 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzk42" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="registry-server" probeResult="failure" output=< Mar 14 07:36:35 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 07:36:35 crc kubenswrapper[4893]: > Mar 14 07:36:36 crc kubenswrapper[4893]: I0314 07:36:36.966089 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9h668" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="registry-server" containerID="cri-o://4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4" gracePeriod=2 Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.414458 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.551988 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content\") pod \"3175605d-4cd2-496a-8792-2dd587b3dd9a\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.552087 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities\") pod \"3175605d-4cd2-496a-8792-2dd587b3dd9a\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.552157 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7czcc\" (UniqueName: 
\"kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc\") pod \"3175605d-4cd2-496a-8792-2dd587b3dd9a\" (UID: \"3175605d-4cd2-496a-8792-2dd587b3dd9a\") " Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.553512 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities" (OuterVolumeSpecName: "utilities") pod "3175605d-4cd2-496a-8792-2dd587b3dd9a" (UID: "3175605d-4cd2-496a-8792-2dd587b3dd9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.558088 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc" (OuterVolumeSpecName: "kube-api-access-7czcc") pod "3175605d-4cd2-496a-8792-2dd587b3dd9a" (UID: "3175605d-4cd2-496a-8792-2dd587b3dd9a"). InnerVolumeSpecName "kube-api-access-7czcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.588014 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3175605d-4cd2-496a-8792-2dd587b3dd9a" (UID: "3175605d-4cd2-496a-8792-2dd587b3dd9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.653847 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.653903 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3175605d-4cd2-496a-8792-2dd587b3dd9a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.653928 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7czcc\" (UniqueName: \"kubernetes.io/projected/3175605d-4cd2-496a-8792-2dd587b3dd9a-kube-api-access-7czcc\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.979158 4893 generic.go:334] "Generic (PLEG): container finished" podID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerID="4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4" exitCode=0 Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.979216 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerDied","Data":"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4"} Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.979256 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9h668" event={"ID":"3175605d-4cd2-496a-8792-2dd587b3dd9a","Type":"ContainerDied","Data":"4295042de41eb7c7418f947483892505f4f5811fb7f3bf0a79a6fc04e322080a"} Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.979258 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9h668" Mar 14 07:36:37 crc kubenswrapper[4893]: I0314 07:36:37.979302 4893 scope.go:117] "RemoveContainer" containerID="4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.004334 4893 scope.go:117] "RemoveContainer" containerID="f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.039579 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.052140 4893 scope.go:117] "RemoveContainer" containerID="0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.054573 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9h668"] Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.080086 4893 scope.go:117] "RemoveContainer" containerID="4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4" Mar 14 07:36:38 crc kubenswrapper[4893]: E0314 07:36:38.081132 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4\": container with ID starting with 4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4 not found: ID does not exist" containerID="4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.081298 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4"} err="failed to get container status \"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4\": rpc error: code = NotFound desc = could not find container 
\"4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4\": container with ID starting with 4a7fc6cb58289ccaf394f83e32222e3b287c8f8a8d86e2ec77c744617ad1e0c4 not found: ID does not exist" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.081445 4893 scope.go:117] "RemoveContainer" containerID="f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874" Mar 14 07:36:38 crc kubenswrapper[4893]: E0314 07:36:38.081967 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874\": container with ID starting with f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874 not found: ID does not exist" containerID="f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.082001 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874"} err="failed to get container status \"f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874\": rpc error: code = NotFound desc = could not find container \"f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874\": container with ID starting with f25f826f70c387ea2f92b3c98e199a6bbfeddddfeed659affc382b1b1f7dc874 not found: ID does not exist" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.082020 4893 scope.go:117] "RemoveContainer" containerID="0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a" Mar 14 07:36:38 crc kubenswrapper[4893]: E0314 07:36:38.082262 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a\": container with ID starting with 0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a not found: ID does not exist" 
containerID="0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a" Mar 14 07:36:38 crc kubenswrapper[4893]: I0314 07:36:38.082284 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a"} err="failed to get container status \"0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a\": rpc error: code = NotFound desc = could not find container \"0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a\": container with ID starting with 0685ff7ea6c39881a0a7dc2a6e5cb391fae9cfb53702357e29914ed7d10a323a not found: ID does not exist" Mar 14 07:36:39 crc kubenswrapper[4893]: I0314 07:36:39.386440 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" path="/var/lib/kubelet/pods/3175605d-4cd2-496a-8792-2dd587b3dd9a/volumes" Mar 14 07:36:45 crc kubenswrapper[4893]: I0314 07:36:45.016414 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:45 crc kubenswrapper[4893]: I0314 07:36:45.098610 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:45 crc kubenswrapper[4893]: I0314 07:36:45.284468 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.055122 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzk42" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="registry-server" containerID="cri-o://387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3" gracePeriod=2 Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.499266 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.621000 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqb77\" (UniqueName: \"kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77\") pod \"b8f77a6f-7369-4936-9631-12789f4c691d\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.621196 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content\") pod \"b8f77a6f-7369-4936-9631-12789f4c691d\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.621275 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities\") pod \"b8f77a6f-7369-4936-9631-12789f4c691d\" (UID: \"b8f77a6f-7369-4936-9631-12789f4c691d\") " Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.623270 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities" (OuterVolumeSpecName: "utilities") pod "b8f77a6f-7369-4936-9631-12789f4c691d" (UID: "b8f77a6f-7369-4936-9631-12789f4c691d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.623995 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.628265 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77" (OuterVolumeSpecName: "kube-api-access-sqb77") pod "b8f77a6f-7369-4936-9631-12789f4c691d" (UID: "b8f77a6f-7369-4936-9631-12789f4c691d"). InnerVolumeSpecName "kube-api-access-sqb77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.726218 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqb77\" (UniqueName: \"kubernetes.io/projected/b8f77a6f-7369-4936-9631-12789f4c691d-kube-api-access-sqb77\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.815695 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8f77a6f-7369-4936-9631-12789f4c691d" (UID: "b8f77a6f-7369-4936-9631-12789f4c691d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:36:46 crc kubenswrapper[4893]: I0314 07:36:46.827185 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f77a6f-7369-4936-9631-12789f4c691d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.068958 4893 generic.go:334] "Generic (PLEG): container finished" podID="b8f77a6f-7369-4936-9631-12789f4c691d" containerID="387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3" exitCode=0 Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.069044 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzk42" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.069069 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerDied","Data":"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3"} Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.069929 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzk42" event={"ID":"b8f77a6f-7369-4936-9631-12789f4c691d","Type":"ContainerDied","Data":"bd78f2db3ffa6c0084042b7a0990bef4c9067baf916fad70615e8561dbbb5f63"} Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.069989 4893 scope.go:117] "RemoveContainer" containerID="387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.113050 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.113250 4893 scope.go:117] "RemoveContainer" containerID="169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 
07:36:47.125812 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzk42"] Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.139568 4893 scope.go:117] "RemoveContainer" containerID="cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.172461 4893 scope.go:117] "RemoveContainer" containerID="387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3" Mar 14 07:36:47 crc kubenswrapper[4893]: E0314 07:36:47.174991 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3\": container with ID starting with 387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3 not found: ID does not exist" containerID="387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.175066 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3"} err="failed to get container status \"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3\": rpc error: code = NotFound desc = could not find container \"387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3\": container with ID starting with 387738813beeb7c3eb1e2cba408fbe370dcd8e5e2561e6d8b6af1f9625de34f3 not found: ID does not exist" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.175108 4893 scope.go:117] "RemoveContainer" containerID="169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990" Mar 14 07:36:47 crc kubenswrapper[4893]: E0314 07:36:47.175604 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990\": container with ID 
starting with 169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990 not found: ID does not exist" containerID="169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.175645 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990"} err="failed to get container status \"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990\": rpc error: code = NotFound desc = could not find container \"169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990\": container with ID starting with 169aedd9b9ae1e73fa8bcc9bb9301f27e9e0168dd388c67624e815f861d7f990 not found: ID does not exist" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.175669 4893 scope.go:117] "RemoveContainer" containerID="cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3" Mar 14 07:36:47 crc kubenswrapper[4893]: E0314 07:36:47.176160 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3\": container with ID starting with cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3 not found: ID does not exist" containerID="cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.176199 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3"} err="failed to get container status \"cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3\": rpc error: code = NotFound desc = could not find container \"cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3\": container with ID starting with cf55891072047381c13df5210c40a409275cf79e966eb6550e494bbe8c74e8f3 not found: 
ID does not exist" Mar 14 07:36:47 crc kubenswrapper[4893]: I0314 07:36:47.401303 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" path="/var/lib/kubelet/pods/b8f77a6f-7369-4936-9631-12789f4c691d/volumes" Mar 14 07:36:59 crc kubenswrapper[4893]: I0314 07:36:59.732160 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:36:59 crc kubenswrapper[4893]: I0314 07:36:59.732786 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:37:07 crc kubenswrapper[4893]: I0314 07:37:07.067348 4893 scope.go:117] "RemoveContainer" containerID="1b410abf40a49057ddfc260eddfb00a317398bcbfb7019f23b4f5b24b557f53d" Mar 14 07:37:29 crc kubenswrapper[4893]: I0314 07:37:29.731050 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:37:29 crc kubenswrapper[4893]: I0314 07:37:29.731773 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:37:59 crc kubenswrapper[4893]: I0314 
07:37:59.731093 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:37:59 crc kubenswrapper[4893]: I0314 07:37:59.731694 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:37:59 crc kubenswrapper[4893]: I0314 07:37:59.731756 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:37:59 crc kubenswrapper[4893]: I0314 07:37:59.732495 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:37:59 crc kubenswrapper[4893]: I0314 07:37:59.732612 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" gracePeriod=600 Mar 14 07:37:59 crc kubenswrapper[4893]: E0314 07:37:59.863234 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.167437 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557898-fxmn9"] Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168156 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="extract-utilities" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168175 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="extract-utilities" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168187 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="extract-utilities" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168195 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="extract-utilities" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168215 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168239 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168262 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168270 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" 
containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168282 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="extract-content" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168289 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="extract-content" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.168298 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="extract-content" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168305 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="extract-content" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168463 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f77a6f-7369-4936-9631-12789f4c691d" containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168488 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="3175605d-4cd2-496a-8792-2dd587b3dd9a" containerName="registry-server" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.168992 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.171766 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.172165 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.172564 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.182488 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-fxmn9"] Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.234732 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" exitCode=0 Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.234783 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab"} Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.234820 4893 scope.go:117] "RemoveContainer" containerID="92dbd2d4668ff2f1c2b1bbddce5f3e3766f7fb6098f86dddd01ddc9fa90c9cf8" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.235404 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:38:00 crc kubenswrapper[4893]: E0314 07:38:00.235624 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.324799 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7zn\" (UniqueName: \"kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn\") pod \"auto-csr-approver-29557898-fxmn9\" (UID: \"96b3624b-adc8-4bd6-98f4-3babf9627dee\") " pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.426743 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7zn\" (UniqueName: \"kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn\") pod \"auto-csr-approver-29557898-fxmn9\" (UID: \"96b3624b-adc8-4bd6-98f4-3babf9627dee\") " pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.460659 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7zn\" (UniqueName: \"kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn\") pod \"auto-csr-approver-29557898-fxmn9\" (UID: \"96b3624b-adc8-4bd6-98f4-3babf9627dee\") " pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.490610 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:00 crc kubenswrapper[4893]: I0314 07:38:00.885894 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-fxmn9"] Mar 14 07:38:01 crc kubenswrapper[4893]: I0314 07:38:01.245343 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" event={"ID":"96b3624b-adc8-4bd6-98f4-3babf9627dee","Type":"ContainerStarted","Data":"e2069041649bdd331724bbe2642bde1d136c31d5d061f5cf50c2b1a5695e770d"} Mar 14 07:38:02 crc kubenswrapper[4893]: I0314 07:38:02.260550 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" event={"ID":"96b3624b-adc8-4bd6-98f4-3babf9627dee","Type":"ContainerStarted","Data":"87f38458f905c0979f8c8ee8cb85a51009a89cff69fb856d5c74e38907626a0f"} Mar 14 07:38:02 crc kubenswrapper[4893]: I0314 07:38:02.280600 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" podStartSLOduration=1.459394162 podStartE2EDuration="2.280572301s" podCreationTimestamp="2026-03-14 07:38:00 +0000 UTC" firstStartedPulling="2026-03-14 07:38:00.898383571 +0000 UTC m=+2360.160560363" lastFinishedPulling="2026-03-14 07:38:01.71956171 +0000 UTC m=+2360.981738502" observedRunningTime="2026-03-14 07:38:02.278425588 +0000 UTC m=+2361.540602420" watchObservedRunningTime="2026-03-14 07:38:02.280572301 +0000 UTC m=+2361.542749133" Mar 14 07:38:03 crc kubenswrapper[4893]: I0314 07:38:03.272702 4893 generic.go:334] "Generic (PLEG): container finished" podID="96b3624b-adc8-4bd6-98f4-3babf9627dee" containerID="87f38458f905c0979f8c8ee8cb85a51009a89cff69fb856d5c74e38907626a0f" exitCode=0 Mar 14 07:38:03 crc kubenswrapper[4893]: I0314 07:38:03.272761 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" 
event={"ID":"96b3624b-adc8-4bd6-98f4-3babf9627dee","Type":"ContainerDied","Data":"87f38458f905c0979f8c8ee8cb85a51009a89cff69fb856d5c74e38907626a0f"} Mar 14 07:38:04 crc kubenswrapper[4893]: I0314 07:38:04.632828 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:04 crc kubenswrapper[4893]: I0314 07:38:04.785372 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr7zn\" (UniqueName: \"kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn\") pod \"96b3624b-adc8-4bd6-98f4-3babf9627dee\" (UID: \"96b3624b-adc8-4bd6-98f4-3babf9627dee\") " Mar 14 07:38:04 crc kubenswrapper[4893]: I0314 07:38:04.792577 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn" (OuterVolumeSpecName: "kube-api-access-wr7zn") pod "96b3624b-adc8-4bd6-98f4-3babf9627dee" (UID: "96b3624b-adc8-4bd6-98f4-3babf9627dee"). InnerVolumeSpecName "kube-api-access-wr7zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:38:04 crc kubenswrapper[4893]: I0314 07:38:04.889004 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr7zn\" (UniqueName: \"kubernetes.io/projected/96b3624b-adc8-4bd6-98f4-3babf9627dee-kube-api-access-wr7zn\") on node \"crc\" DevicePath \"\"" Mar 14 07:38:05 crc kubenswrapper[4893]: I0314 07:38:05.290396 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" event={"ID":"96b3624b-adc8-4bd6-98f4-3babf9627dee","Type":"ContainerDied","Data":"e2069041649bdd331724bbe2642bde1d136c31d5d061f5cf50c2b1a5695e770d"} Mar 14 07:38:05 crc kubenswrapper[4893]: I0314 07:38:05.290449 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2069041649bdd331724bbe2642bde1d136c31d5d061f5cf50c2b1a5695e770d" Mar 14 07:38:05 crc kubenswrapper[4893]: I0314 07:38:05.290671 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-fxmn9" Mar 14 07:38:05 crc kubenswrapper[4893]: I0314 07:38:05.704668 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-2269r"] Mar 14 07:38:05 crc kubenswrapper[4893]: I0314 07:38:05.712363 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-2269r"] Mar 14 07:38:07 crc kubenswrapper[4893]: I0314 07:38:07.394751 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8697a2cb-05a0-4af0-8579-9a6629290862" path="/var/lib/kubelet/pods/8697a2cb-05a0-4af0-8579-9a6629290862/volumes" Mar 14 07:38:15 crc kubenswrapper[4893]: I0314 07:38:15.377239 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:38:15 crc kubenswrapper[4893]: E0314 07:38:15.381406 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:38:30 crc kubenswrapper[4893]: I0314 07:38:30.377297 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:38:30 crc kubenswrapper[4893]: E0314 07:38:30.378491 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:38:42 crc kubenswrapper[4893]: I0314 07:38:42.377848 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:38:42 crc kubenswrapper[4893]: E0314 07:38:42.378983 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:38:54 crc kubenswrapper[4893]: I0314 07:38:54.376854 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:38:54 crc kubenswrapper[4893]: E0314 07:38:54.377814 4893 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:39:07 crc kubenswrapper[4893]: I0314 07:39:07.219569 4893 scope.go:117] "RemoveContainer" containerID="179eac334379ef478f65f3ba4f3324286015925fb2d4e09fa8fb73affc60f8a9" Mar 14 07:39:08 crc kubenswrapper[4893]: I0314 07:39:08.376600 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:39:08 crc kubenswrapper[4893]: E0314 07:39:08.377040 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:39:20 crc kubenswrapper[4893]: I0314 07:39:20.376907 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:39:20 crc kubenswrapper[4893]: E0314 07:39:20.377689 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:39:32 crc kubenswrapper[4893]: I0314 07:39:32.377209 4893 scope.go:117] 
"RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:39:32 crc kubenswrapper[4893]: E0314 07:39:32.378108 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:39:45 crc kubenswrapper[4893]: I0314 07:39:45.377144 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:39:45 crc kubenswrapper[4893]: E0314 07:39:45.378208 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:39:59 crc kubenswrapper[4893]: I0314 07:39:59.376565 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:39:59 crc kubenswrapper[4893]: E0314 07:39:59.377344 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.139580 
4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557900-d6wgw"] Mar 14 07:40:00 crc kubenswrapper[4893]: E0314 07:40:00.139997 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b3624b-adc8-4bd6-98f4-3babf9627dee" containerName="oc" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.140020 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b3624b-adc8-4bd6-98f4-3babf9627dee" containerName="oc" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.140208 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b3624b-adc8-4bd6-98f4-3babf9627dee" containerName="oc" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.140733 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.142432 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.142649 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.142955 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.152615 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-d6wgw"] Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.262421 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwprv\" (UniqueName: \"kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv\") pod \"auto-csr-approver-29557900-d6wgw\" (UID: \"9f23eec9-28e9-4fb9-9bfe-2cdd920df673\") " 
pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.364224 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwprv\" (UniqueName: \"kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv\") pod \"auto-csr-approver-29557900-d6wgw\" (UID: \"9f23eec9-28e9-4fb9-9bfe-2cdd920df673\") " pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.383905 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwprv\" (UniqueName: \"kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv\") pod \"auto-csr-approver-29557900-d6wgw\" (UID: \"9f23eec9-28e9-4fb9-9bfe-2cdd920df673\") " pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.474625 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:00 crc kubenswrapper[4893]: I0314 07:40:00.888216 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-d6wgw"] Mar 14 07:40:01 crc kubenswrapper[4893]: I0314 07:40:01.352816 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" event={"ID":"9f23eec9-28e9-4fb9-9bfe-2cdd920df673","Type":"ContainerStarted","Data":"f40a20dc5a4161985a1518cb9bd3c8a456c10a59f3b275aa35ad43cfc1874a5f"} Mar 14 07:40:03 crc kubenswrapper[4893]: I0314 07:40:03.369361 4893 generic.go:334] "Generic (PLEG): container finished" podID="9f23eec9-28e9-4fb9-9bfe-2cdd920df673" containerID="188c65ffe61a1f706a576b3f06bf58d8ce14c03b100c4c985656f2757c8131ec" exitCode=0 Mar 14 07:40:03 crc kubenswrapper[4893]: I0314 07:40:03.369448 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557900-d6wgw" event={"ID":"9f23eec9-28e9-4fb9-9bfe-2cdd920df673","Type":"ContainerDied","Data":"188c65ffe61a1f706a576b3f06bf58d8ce14c03b100c4c985656f2757c8131ec"} Mar 14 07:40:04 crc kubenswrapper[4893]: I0314 07:40:04.637048 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:04 crc kubenswrapper[4893]: I0314 07:40:04.816853 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwprv\" (UniqueName: \"kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv\") pod \"9f23eec9-28e9-4fb9-9bfe-2cdd920df673\" (UID: \"9f23eec9-28e9-4fb9-9bfe-2cdd920df673\") " Mar 14 07:40:04 crc kubenswrapper[4893]: I0314 07:40:04.822774 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv" (OuterVolumeSpecName: "kube-api-access-pwprv") pod "9f23eec9-28e9-4fb9-9bfe-2cdd920df673" (UID: "9f23eec9-28e9-4fb9-9bfe-2cdd920df673"). InnerVolumeSpecName "kube-api-access-pwprv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:40:04 crc kubenswrapper[4893]: I0314 07:40:04.918713 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwprv\" (UniqueName: \"kubernetes.io/projected/9f23eec9-28e9-4fb9-9bfe-2cdd920df673-kube-api-access-pwprv\") on node \"crc\" DevicePath \"\"" Mar 14 07:40:05 crc kubenswrapper[4893]: I0314 07:40:05.387432 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" event={"ID":"9f23eec9-28e9-4fb9-9bfe-2cdd920df673","Type":"ContainerDied","Data":"f40a20dc5a4161985a1518cb9bd3c8a456c10a59f3b275aa35ad43cfc1874a5f"} Mar 14 07:40:05 crc kubenswrapper[4893]: I0314 07:40:05.387808 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f40a20dc5a4161985a1518cb9bd3c8a456c10a59f3b275aa35ad43cfc1874a5f" Mar 14 07:40:05 crc kubenswrapper[4893]: I0314 07:40:05.387486 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-d6wgw" Mar 14 07:40:05 crc kubenswrapper[4893]: I0314 07:40:05.722258 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-xhdbs"] Mar 14 07:40:05 crc kubenswrapper[4893]: I0314 07:40:05.727035 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-xhdbs"] Mar 14 07:40:07 crc kubenswrapper[4893]: I0314 07:40:07.396111 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761ca03d-ec29-4f34-a325-3396ffbf620e" path="/var/lib/kubelet/pods/761ca03d-ec29-4f34-a325-3396ffbf620e/volumes" Mar 14 07:40:14 crc kubenswrapper[4893]: I0314 07:40:14.377377 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:40:14 crc kubenswrapper[4893]: E0314 07:40:14.378208 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:40:27 crc kubenswrapper[4893]: I0314 07:40:27.377452 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:40:27 crc kubenswrapper[4893]: E0314 07:40:27.378311 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:40:38 crc kubenswrapper[4893]: I0314 07:40:38.376960 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:40:38 crc kubenswrapper[4893]: E0314 07:40:38.377599 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.205970 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxkg9"] Mar 14 07:40:39 crc kubenswrapper[4893]: E0314 07:40:39.206297 4893 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f23eec9-28e9-4fb9-9bfe-2cdd920df673" containerName="oc" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.206313 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f23eec9-28e9-4fb9-9bfe-2cdd920df673" containerName="oc" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.206486 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f23eec9-28e9-4fb9-9bfe-2cdd920df673" containerName="oc" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.207803 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.222042 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxkg9"] Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.330957 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49v2\" (UniqueName: \"kubernetes.io/projected/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-kube-api-access-h49v2\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.331117 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-utilities\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.331146 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-catalog-content\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " 
pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.432268 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-utilities\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.432345 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-catalog-content\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.432402 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49v2\" (UniqueName: \"kubernetes.io/projected/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-kube-api-access-h49v2\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.432871 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-utilities\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.433288 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-catalog-content\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " 
pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.460657 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49v2\" (UniqueName: \"kubernetes.io/projected/f5c55a2d-4a16-4a38-97d2-1af4328c8ae7-kube-api-access-h49v2\") pod \"community-operators-mxkg9\" (UID: \"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7\") " pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:39 crc kubenswrapper[4893]: I0314 07:40:39.532229 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:40 crc kubenswrapper[4893]: I0314 07:40:40.008869 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxkg9"] Mar 14 07:40:40 crc kubenswrapper[4893]: W0314 07:40:40.019318 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c55a2d_4a16_4a38_97d2_1af4328c8ae7.slice/crio-6e1423947a154fc89669225087718547a7837904b5391aea7b1e8614945b335b WatchSource:0}: Error finding container 6e1423947a154fc89669225087718547a7837904b5391aea7b1e8614945b335b: Status 404 returned error can't find the container with id 6e1423947a154fc89669225087718547a7837904b5391aea7b1e8614945b335b Mar 14 07:40:40 crc kubenswrapper[4893]: I0314 07:40:40.690242 4893 generic.go:334] "Generic (PLEG): container finished" podID="f5c55a2d-4a16-4a38-97d2-1af4328c8ae7" containerID="1da89114f4636048e6d3674edbc1a46bb0e5024cdb1b9b7f89781bf91b4f446d" exitCode=0 Mar 14 07:40:40 crc kubenswrapper[4893]: I0314 07:40:40.690346 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxkg9" event={"ID":"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7","Type":"ContainerDied","Data":"1da89114f4636048e6d3674edbc1a46bb0e5024cdb1b9b7f89781bf91b4f446d"} Mar 14 07:40:40 crc kubenswrapper[4893]: I0314 07:40:40.690780 
4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxkg9" event={"ID":"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7","Type":"ContainerStarted","Data":"6e1423947a154fc89669225087718547a7837904b5391aea7b1e8614945b335b"} Mar 14 07:40:44 crc kubenswrapper[4893]: I0314 07:40:44.724088 4893 generic.go:334] "Generic (PLEG): container finished" podID="f5c55a2d-4a16-4a38-97d2-1af4328c8ae7" containerID="0ded860d0fde1cde1994093c8e886bc338326206279160ec03b7dc9cb631f7df" exitCode=0 Mar 14 07:40:44 crc kubenswrapper[4893]: I0314 07:40:44.724210 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxkg9" event={"ID":"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7","Type":"ContainerDied","Data":"0ded860d0fde1cde1994093c8e886bc338326206279160ec03b7dc9cb631f7df"} Mar 14 07:40:45 crc kubenswrapper[4893]: I0314 07:40:45.736548 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxkg9" event={"ID":"f5c55a2d-4a16-4a38-97d2-1af4328c8ae7","Type":"ContainerStarted","Data":"87b775b894a5afe7a34ab5d1feb6e1c0b59391c0bf0aec2060c99b75a28f12a0"} Mar 14 07:40:45 crc kubenswrapper[4893]: I0314 07:40:45.758265 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxkg9" podStartSLOduration=2.341387281 podStartE2EDuration="6.758245383s" podCreationTimestamp="2026-03-14 07:40:39 +0000 UTC" firstStartedPulling="2026-03-14 07:40:40.695687153 +0000 UTC m=+2519.957863975" lastFinishedPulling="2026-03-14 07:40:45.112545295 +0000 UTC m=+2524.374722077" observedRunningTime="2026-03-14 07:40:45.757538575 +0000 UTC m=+2525.019715397" watchObservedRunningTime="2026-03-14 07:40:45.758245383 +0000 UTC m=+2525.020422175" Mar 14 07:40:49 crc kubenswrapper[4893]: I0314 07:40:49.377063 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:40:49 crc 
kubenswrapper[4893]: E0314 07:40:49.378134 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:40:49 crc kubenswrapper[4893]: I0314 07:40:49.533223 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:49 crc kubenswrapper[4893]: I0314 07:40:49.533265 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:49 crc kubenswrapper[4893]: I0314 07:40:49.581512 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:59 crc kubenswrapper[4893]: I0314 07:40:59.587706 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxkg9" Mar 14 07:40:59 crc kubenswrapper[4893]: I0314 07:40:59.677508 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxkg9"] Mar 14 07:40:59 crc kubenswrapper[4893]: I0314 07:40:59.734744 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:40:59 crc kubenswrapper[4893]: I0314 07:40:59.734991 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nkk7t" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="registry-server" containerID="cri-o://9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a" gracePeriod=2 Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 
07:41:00.119562 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.245591 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wh9m\" (UniqueName: \"kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m\") pod \"9e3c87be-1689-4d71-b7e1-d71829082b39\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.245679 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities\") pod \"9e3c87be-1689-4d71-b7e1-d71829082b39\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.245723 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content\") pod \"9e3c87be-1689-4d71-b7e1-d71829082b39\" (UID: \"9e3c87be-1689-4d71-b7e1-d71829082b39\") " Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.246314 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities" (OuterVolumeSpecName: "utilities") pod "9e3c87be-1689-4d71-b7e1-d71829082b39" (UID: "9e3c87be-1689-4d71-b7e1-d71829082b39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.252250 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m" (OuterVolumeSpecName: "kube-api-access-2wh9m") pod "9e3c87be-1689-4d71-b7e1-d71829082b39" (UID: "9e3c87be-1689-4d71-b7e1-d71829082b39"). InnerVolumeSpecName "kube-api-access-2wh9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.340888 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e3c87be-1689-4d71-b7e1-d71829082b39" (UID: "9e3c87be-1689-4d71-b7e1-d71829082b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.346964 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.347013 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wh9m\" (UniqueName: \"kubernetes.io/projected/9e3c87be-1689-4d71-b7e1-d71829082b39-kube-api-access-2wh9m\") on node \"crc\" DevicePath \"\"" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.347026 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3c87be-1689-4d71-b7e1-d71829082b39-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.848831 4893 generic.go:334] "Generic (PLEG): container finished" podID="9e3c87be-1689-4d71-b7e1-d71829082b39" 
containerID="9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a" exitCode=0 Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.848891 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkk7t" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.848936 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerDied","Data":"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a"} Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.849750 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkk7t" event={"ID":"9e3c87be-1689-4d71-b7e1-d71829082b39","Type":"ContainerDied","Data":"b09b6b15446dd3e63f6241d8276c9f4b035f3b505842ade22f2c1eb59bd63e97"} Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.849790 4893 scope.go:117] "RemoveContainer" containerID="9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.877750 4893 scope.go:117] "RemoveContainer" containerID="763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.885698 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.891342 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nkk7t"] Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.902224 4893 scope.go:117] "RemoveContainer" containerID="23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.930190 4893 scope.go:117] "RemoveContainer" containerID="9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a" Mar 14 
07:41:00 crc kubenswrapper[4893]: E0314 07:41:00.930610 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a\": container with ID starting with 9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a not found: ID does not exist" containerID="9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.930640 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a"} err="failed to get container status \"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a\": rpc error: code = NotFound desc = could not find container \"9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a\": container with ID starting with 9c825471d259f8050c542c4f894658221959723ad9e6ba3132c839512456366a not found: ID does not exist" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.930659 4893 scope.go:117] "RemoveContainer" containerID="763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc" Mar 14 07:41:00 crc kubenswrapper[4893]: E0314 07:41:00.930943 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc\": container with ID starting with 763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc not found: ID does not exist" containerID="763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.930965 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc"} err="failed to get container status 
\"763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc\": rpc error: code = NotFound desc = could not find container \"763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc\": container with ID starting with 763e3789a1094fa4a390bb442fc1a920abfd164c30f77c77f3596aee09477afc not found: ID does not exist" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.930978 4893 scope.go:117] "RemoveContainer" containerID="23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c" Mar 14 07:41:00 crc kubenswrapper[4893]: E0314 07:41:00.931191 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c\": container with ID starting with 23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c not found: ID does not exist" containerID="23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c" Mar 14 07:41:00 crc kubenswrapper[4893]: I0314 07:41:00.931213 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c"} err="failed to get container status \"23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c\": rpc error: code = NotFound desc = could not find container \"23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c\": container with ID starting with 23920811e5af972e9f7e31cd5c859d37dd9b53123db7400b1c8f4b8fd098419c not found: ID does not exist" Mar 14 07:41:01 crc kubenswrapper[4893]: I0314 07:41:01.387098 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" path="/var/lib/kubelet/pods/9e3c87be-1689-4d71-b7e1-d71829082b39/volumes" Mar 14 07:41:02 crc kubenswrapper[4893]: I0314 07:41:02.376695 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 
07:41:02 crc kubenswrapper[4893]: E0314 07:41:02.377304 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:41:07 crc kubenswrapper[4893]: I0314 07:41:07.315915 4893 scope.go:117] "RemoveContainer" containerID="707959329ad44149cc2390288f5cbbc19baee193dcc5acb11ddf6fd64aff9359" Mar 14 07:41:14 crc kubenswrapper[4893]: I0314 07:41:14.376798 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:41:14 crc kubenswrapper[4893]: E0314 07:41:14.377607 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:41:27 crc kubenswrapper[4893]: I0314 07:41:27.377058 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:41:27 crc kubenswrapper[4893]: E0314 07:41:27.379365 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:41:38 crc kubenswrapper[4893]: I0314 07:41:38.376465 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:41:38 crc kubenswrapper[4893]: E0314 07:41:38.377492 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:41:53 crc kubenswrapper[4893]: I0314 07:41:53.377234 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:41:53 crc kubenswrapper[4893]: E0314 07:41:53.378502 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.138113 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557902-bnbmw"] Mar 14 07:42:00 crc kubenswrapper[4893]: E0314 07:42:00.138851 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="registry-server" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.138869 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="registry-server" Mar 14 07:42:00 crc kubenswrapper[4893]: 
E0314 07:42:00.138885 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="extract-utilities" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.138893 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="extract-utilities" Mar 14 07:42:00 crc kubenswrapper[4893]: E0314 07:42:00.138904 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="extract-content" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.138911 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="extract-content" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.139054 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3c87be-1689-4d71-b7e1-d71829082b39" containerName="registry-server" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.139594 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.143629 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.143855 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.145007 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.151011 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-bnbmw"] Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.213724 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr5hw\" (UniqueName: \"kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw\") pod \"auto-csr-approver-29557902-bnbmw\" (UID: \"a7cb9ac3-d81d-481c-be21-914866d0d3d8\") " pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.315026 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr5hw\" (UniqueName: \"kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw\") pod \"auto-csr-approver-29557902-bnbmw\" (UID: \"a7cb9ac3-d81d-481c-be21-914866d0d3d8\") " pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.334420 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr5hw\" (UniqueName: \"kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw\") pod \"auto-csr-approver-29557902-bnbmw\" (UID: \"a7cb9ac3-d81d-481c-be21-914866d0d3d8\") " 
pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.469404 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.696211 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-bnbmw"] Mar 14 07:42:00 crc kubenswrapper[4893]: I0314 07:42:00.700132 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:42:01 crc kubenswrapper[4893]: I0314 07:42:01.299946 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" event={"ID":"a7cb9ac3-d81d-481c-be21-914866d0d3d8","Type":"ContainerStarted","Data":"cd93b8fc5a01d0409c4c3c68b0a2294fcfb07f20e0f206709545887c9852f994"} Mar 14 07:42:03 crc kubenswrapper[4893]: I0314 07:42:03.314435 4893 generic.go:334] "Generic (PLEG): container finished" podID="a7cb9ac3-d81d-481c-be21-914866d0d3d8" containerID="c3f3a9afceb13e93fb754ab04f95b7f6bd1c323074479894b9e7909f1a705c75" exitCode=0 Mar 14 07:42:03 crc kubenswrapper[4893]: I0314 07:42:03.314491 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" event={"ID":"a7cb9ac3-d81d-481c-be21-914866d0d3d8","Type":"ContainerDied","Data":"c3f3a9afceb13e93fb754ab04f95b7f6bd1c323074479894b9e7909f1a705c75"} Mar 14 07:42:04 crc kubenswrapper[4893]: I0314 07:42:04.624331 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:04 crc kubenswrapper[4893]: I0314 07:42:04.780480 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr5hw\" (UniqueName: \"kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw\") pod \"a7cb9ac3-d81d-481c-be21-914866d0d3d8\" (UID: \"a7cb9ac3-d81d-481c-be21-914866d0d3d8\") " Mar 14 07:42:04 crc kubenswrapper[4893]: I0314 07:42:04.785732 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw" (OuterVolumeSpecName: "kube-api-access-cr5hw") pod "a7cb9ac3-d81d-481c-be21-914866d0d3d8" (UID: "a7cb9ac3-d81d-481c-be21-914866d0d3d8"). InnerVolumeSpecName "kube-api-access-cr5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:42:04 crc kubenswrapper[4893]: I0314 07:42:04.881623 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr5hw\" (UniqueName: \"kubernetes.io/projected/a7cb9ac3-d81d-481c-be21-914866d0d3d8-kube-api-access-cr5hw\") on node \"crc\" DevicePath \"\"" Mar 14 07:42:05 crc kubenswrapper[4893]: I0314 07:42:05.336009 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" event={"ID":"a7cb9ac3-d81d-481c-be21-914866d0d3d8","Type":"ContainerDied","Data":"cd93b8fc5a01d0409c4c3c68b0a2294fcfb07f20e0f206709545887c9852f994"} Mar 14 07:42:05 crc kubenswrapper[4893]: I0314 07:42:05.336057 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd93b8fc5a01d0409c4c3c68b0a2294fcfb07f20e0f206709545887c9852f994" Mar 14 07:42:05 crc kubenswrapper[4893]: I0314 07:42:05.336652 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-bnbmw" Mar 14 07:42:05 crc kubenswrapper[4893]: I0314 07:42:05.686098 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-zhqmr"] Mar 14 07:42:05 crc kubenswrapper[4893]: I0314 07:42:05.691058 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-zhqmr"] Mar 14 07:42:07 crc kubenswrapper[4893]: I0314 07:42:07.376214 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:42:07 crc kubenswrapper[4893]: E0314 07:42:07.376482 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:42:07 crc kubenswrapper[4893]: I0314 07:42:07.384909 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7d9832-78e5-4045-a6fa-64faf415f86b" path="/var/lib/kubelet/pods/3b7d9832-78e5-4045-a6fa-64faf415f86b/volumes" Mar 14 07:42:07 crc kubenswrapper[4893]: I0314 07:42:07.390151 4893 scope.go:117] "RemoveContainer" containerID="5312e1b3af7b2942dfd72589b2474274aa88700f3430699d9fd970bbc29c941b" Mar 14 07:42:22 crc kubenswrapper[4893]: I0314 07:42:22.376274 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:42:22 crc kubenswrapper[4893]: E0314 07:42:22.377059 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:42:33 crc kubenswrapper[4893]: I0314 07:42:33.377978 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:42:33 crc kubenswrapper[4893]: E0314 07:42:33.379251 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:42:48 crc kubenswrapper[4893]: I0314 07:42:48.377614 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:42:48 crc kubenswrapper[4893]: E0314 07:42:48.378304 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:43:03 crc kubenswrapper[4893]: I0314 07:43:03.377122 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:43:03 crc kubenswrapper[4893]: I0314 07:43:03.756995 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f"} Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.150860 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557904-4z2xz"] Mar 14 07:44:00 crc kubenswrapper[4893]: E0314 07:44:00.153385 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cb9ac3-d81d-481c-be21-914866d0d3d8" containerName="oc" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.153403 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cb9ac3-d81d-481c-be21-914866d0d3d8" containerName="oc" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.153576 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cb9ac3-d81d-481c-be21-914866d0d3d8" containerName="oc" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.154506 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.156980 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.157167 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.158872 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.162700 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-4z2xz"] Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.268206 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26brc\" (UniqueName: 
\"kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc\") pod \"auto-csr-approver-29557904-4z2xz\" (UID: \"36dcf765-b9cd-40a4-8794-40dd081e5d91\") " pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.369112 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26brc\" (UniqueName: \"kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc\") pod \"auto-csr-approver-29557904-4z2xz\" (UID: \"36dcf765-b9cd-40a4-8794-40dd081e5d91\") " pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.387325 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26brc\" (UniqueName: \"kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc\") pod \"auto-csr-approver-29557904-4z2xz\" (UID: \"36dcf765-b9cd-40a4-8794-40dd081e5d91\") " pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.481085 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:00 crc kubenswrapper[4893]: I0314 07:44:00.897619 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-4z2xz"] Mar 14 07:44:00 crc kubenswrapper[4893]: W0314 07:44:00.906221 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36dcf765_b9cd_40a4_8794_40dd081e5d91.slice/crio-6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c WatchSource:0}: Error finding container 6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c: Status 404 returned error can't find the container with id 6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c Mar 14 07:44:01 crc kubenswrapper[4893]: I0314 07:44:01.168776 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" event={"ID":"36dcf765-b9cd-40a4-8794-40dd081e5d91","Type":"ContainerStarted","Data":"6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c"} Mar 14 07:44:02 crc kubenswrapper[4893]: I0314 07:44:02.176782 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" event={"ID":"36dcf765-b9cd-40a4-8794-40dd081e5d91","Type":"ContainerStarted","Data":"d8e2b21cc83acc02487212d8644c920f9c242450d7179f285949fe5868385b5c"} Mar 14 07:44:02 crc kubenswrapper[4893]: I0314 07:44:02.194795 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" podStartSLOduration=1.239986513 podStartE2EDuration="2.194768363s" podCreationTimestamp="2026-03-14 07:44:00 +0000 UTC" firstStartedPulling="2026-03-14 07:44:00.909448696 +0000 UTC m=+2720.171625498" lastFinishedPulling="2026-03-14 07:44:01.864230556 +0000 UTC m=+2721.126407348" observedRunningTime="2026-03-14 07:44:02.191616616 +0000 UTC m=+2721.453793418" 
watchObservedRunningTime="2026-03-14 07:44:02.194768363 +0000 UTC m=+2721.456945195" Mar 14 07:44:03 crc kubenswrapper[4893]: I0314 07:44:03.187748 4893 generic.go:334] "Generic (PLEG): container finished" podID="36dcf765-b9cd-40a4-8794-40dd081e5d91" containerID="d8e2b21cc83acc02487212d8644c920f9c242450d7179f285949fe5868385b5c" exitCode=0 Mar 14 07:44:03 crc kubenswrapper[4893]: I0314 07:44:03.187846 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" event={"ID":"36dcf765-b9cd-40a4-8794-40dd081e5d91","Type":"ContainerDied","Data":"d8e2b21cc83acc02487212d8644c920f9c242450d7179f285949fe5868385b5c"} Mar 14 07:44:04 crc kubenswrapper[4893]: I0314 07:44:04.536223 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:04 crc kubenswrapper[4893]: I0314 07:44:04.635780 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26brc\" (UniqueName: \"kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc\") pod \"36dcf765-b9cd-40a4-8794-40dd081e5d91\" (UID: \"36dcf765-b9cd-40a4-8794-40dd081e5d91\") " Mar 14 07:44:04 crc kubenswrapper[4893]: I0314 07:44:04.642027 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc" (OuterVolumeSpecName: "kube-api-access-26brc") pod "36dcf765-b9cd-40a4-8794-40dd081e5d91" (UID: "36dcf765-b9cd-40a4-8794-40dd081e5d91"). InnerVolumeSpecName "kube-api-access-26brc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:44:04 crc kubenswrapper[4893]: I0314 07:44:04.737614 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26brc\" (UniqueName: \"kubernetes.io/projected/36dcf765-b9cd-40a4-8794-40dd081e5d91-kube-api-access-26brc\") on node \"crc\" DevicePath \"\"" Mar 14 07:44:05 crc kubenswrapper[4893]: I0314 07:44:05.223478 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" event={"ID":"36dcf765-b9cd-40a4-8794-40dd081e5d91","Type":"ContainerDied","Data":"6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c"} Mar 14 07:44:05 crc kubenswrapper[4893]: I0314 07:44:05.224885 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f51a2bff6e9361a8382170a966477501e16b7dd7bbbafc044ff009244edf61c" Mar 14 07:44:05 crc kubenswrapper[4893]: I0314 07:44:05.224551 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-4z2xz" Mar 14 07:44:05 crc kubenswrapper[4893]: I0314 07:44:05.606547 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-fxmn9"] Mar 14 07:44:05 crc kubenswrapper[4893]: I0314 07:44:05.616683 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-fxmn9"] Mar 14 07:44:07 crc kubenswrapper[4893]: I0314 07:44:07.389087 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b3624b-adc8-4bd6-98f4-3babf9627dee" path="/var/lib/kubelet/pods/96b3624b-adc8-4bd6-98f4-3babf9627dee/volumes" Mar 14 07:44:07 crc kubenswrapper[4893]: I0314 07:44:07.474802 4893 scope.go:117] "RemoveContainer" containerID="87f38458f905c0979f8c8ee8cb85a51009a89cff69fb856d5c74e38907626a0f" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.145710 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq"] Mar 14 07:45:00 crc kubenswrapper[4893]: E0314 07:45:00.146614 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dcf765-b9cd-40a4-8794-40dd081e5d91" containerName="oc" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.146632 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dcf765-b9cd-40a4-8794-40dd081e5d91" containerName="oc" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.146874 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="36dcf765-b9cd-40a4-8794-40dd081e5d91" containerName="oc" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.147896 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.150973 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.152057 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.152723 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq"] Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.244337 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.244435 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lx5k\" (UniqueName: \"kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.244489 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.345431 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.345487 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lx5k\" (UniqueName: \"kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.345537 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.346698 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.356644 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.366693 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lx5k\" (UniqueName: \"kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k\") pod \"collect-profiles-29557905-v85sq\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.465730 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:00 crc kubenswrapper[4893]: I0314 07:45:00.914883 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq"] Mar 14 07:45:01 crc kubenswrapper[4893]: I0314 07:45:01.643718 4893 generic.go:334] "Generic (PLEG): container finished" podID="42235e44-e2b1-4658-a6f9-994a77d58696" containerID="1b4d799c3397d0bacd3987a1f21bdcd15a65ff61014165699516fde7d0ec6ebe" exitCode=0 Mar 14 07:45:01 crc kubenswrapper[4893]: I0314 07:45:01.643818 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" event={"ID":"42235e44-e2b1-4658-a6f9-994a77d58696","Type":"ContainerDied","Data":"1b4d799c3397d0bacd3987a1f21bdcd15a65ff61014165699516fde7d0ec6ebe"} Mar 14 07:45:01 crc kubenswrapper[4893]: I0314 07:45:01.644024 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" event={"ID":"42235e44-e2b1-4658-a6f9-994a77d58696","Type":"ContainerStarted","Data":"6dfb23b121bc6bde0b7d4721f4b28fca649bf0913785e313e9074a98762d6efb"} Mar 14 07:45:02 crc kubenswrapper[4893]: I0314 07:45:02.955413 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.088323 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lx5k\" (UniqueName: \"kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k\") pod \"42235e44-e2b1-4658-a6f9-994a77d58696\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.088466 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume\") pod \"42235e44-e2b1-4658-a6f9-994a77d58696\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.088502 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume\") pod \"42235e44-e2b1-4658-a6f9-994a77d58696\" (UID: \"42235e44-e2b1-4658-a6f9-994a77d58696\") " Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.089933 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume" (OuterVolumeSpecName: "config-volume") pod "42235e44-e2b1-4658-a6f9-994a77d58696" (UID: "42235e44-e2b1-4658-a6f9-994a77d58696"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.098427 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42235e44-e2b1-4658-a6f9-994a77d58696" (UID: "42235e44-e2b1-4658-a6f9-994a77d58696"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.098879 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k" (OuterVolumeSpecName: "kube-api-access-9lx5k") pod "42235e44-e2b1-4658-a6f9-994a77d58696" (UID: "42235e44-e2b1-4658-a6f9-994a77d58696"). InnerVolumeSpecName "kube-api-access-9lx5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.190652 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lx5k\" (UniqueName: \"kubernetes.io/projected/42235e44-e2b1-4658-a6f9-994a77d58696-kube-api-access-9lx5k\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.191198 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42235e44-e2b1-4658-a6f9-994a77d58696-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.191317 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42235e44-e2b1-4658-a6f9-994a77d58696-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:03 crc kubenswrapper[4893]: E0314 07:45:03.542229 4893 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42235e44_e2b1_4658_a6f9_994a77d58696.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42235e44_e2b1_4658_a6f9_994a77d58696.slice/crio-6dfb23b121bc6bde0b7d4721f4b28fca649bf0913785e313e9074a98762d6efb\": RecentStats: unable to find data in memory cache]" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.657680 4893 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" event={"ID":"42235e44-e2b1-4658-a6f9-994a77d58696","Type":"ContainerDied","Data":"6dfb23b121bc6bde0b7d4721f4b28fca649bf0913785e313e9074a98762d6efb"} Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.658001 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfb23b121bc6bde0b7d4721f4b28fca649bf0913785e313e9074a98762d6efb" Mar 14 07:45:03 crc kubenswrapper[4893]: I0314 07:45:03.657748 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq" Mar 14 07:45:04 crc kubenswrapper[4893]: I0314 07:45:04.025723 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"] Mar 14 07:45:04 crc kubenswrapper[4893]: I0314 07:45:04.030377 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-g42hf"] Mar 14 07:45:05 crc kubenswrapper[4893]: I0314 07:45:05.386340 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3009e07b-2452-425c-95c3-3a78fa993d62" path="/var/lib/kubelet/pods/3009e07b-2452-425c-95c3-3a78fa993d62/volumes" Mar 14 07:45:07 crc kubenswrapper[4893]: I0314 07:45:07.574033 4893 scope.go:117] "RemoveContainer" containerID="f346a9d6640b2d3342f142ca50107602a1405bb3b1a5118c157b77375c6a113a" Mar 14 07:45:29 crc kubenswrapper[4893]: I0314 07:45:29.732751 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:45:29 crc kubenswrapper[4893]: I0314 07:45:29.733674 4893 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:45:59 crc kubenswrapper[4893]: I0314 07:45:59.731494 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:45:59 crc kubenswrapper[4893]: I0314 07:45:59.732087 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.136397 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557906-wr4ng"] Mar 14 07:46:00 crc kubenswrapper[4893]: E0314 07:46:00.136746 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42235e44-e2b1-4658-a6f9-994a77d58696" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.136763 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="42235e44-e2b1-4658-a6f9-994a77d58696" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.136905 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="42235e44-e2b1-4658-a6f9-994a77d58696" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.137404 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.143127 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.143292 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.143423 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.149605 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-wr4ng"] Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.237569 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvd9\" (UniqueName: \"kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9\") pod \"auto-csr-approver-29557906-wr4ng\" (UID: \"82a414a7-6d45-4bb0-a625-374719d0482e\") " pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.339019 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvd9\" (UniqueName: \"kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9\") pod \"auto-csr-approver-29557906-wr4ng\" (UID: \"82a414a7-6d45-4bb0-a625-374719d0482e\") " pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.361156 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvd9\" (UniqueName: \"kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9\") pod \"auto-csr-approver-29557906-wr4ng\" (UID: \"82a414a7-6d45-4bb0-a625-374719d0482e\") " 
pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.457338 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:00 crc kubenswrapper[4893]: I0314 07:46:00.884975 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-wr4ng"] Mar 14 07:46:01 crc kubenswrapper[4893]: I0314 07:46:01.086320 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" event={"ID":"82a414a7-6d45-4bb0-a625-374719d0482e","Type":"ContainerStarted","Data":"0a88e99acabd54e2d2bdab157450806b6605304e1428e910ea4a0c33cd89e444"} Mar 14 07:46:02 crc kubenswrapper[4893]: I0314 07:46:02.093771 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" event={"ID":"82a414a7-6d45-4bb0-a625-374719d0482e","Type":"ContainerStarted","Data":"c208fead99477366a558d9bfd2e24e0fc68b80ef714a4b3bbdaf654cb3ceb973"} Mar 14 07:46:02 crc kubenswrapper[4893]: I0314 07:46:02.112798 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" podStartSLOduration=1.2598128100000001 podStartE2EDuration="2.112768566s" podCreationTimestamp="2026-03-14 07:46:00 +0000 UTC" firstStartedPulling="2026-03-14 07:46:00.892766799 +0000 UTC m=+2840.154943591" lastFinishedPulling="2026-03-14 07:46:01.745722545 +0000 UTC m=+2841.007899347" observedRunningTime="2026-03-14 07:46:02.106506323 +0000 UTC m=+2841.368683155" watchObservedRunningTime="2026-03-14 07:46:02.112768566 +0000 UTC m=+2841.374945388" Mar 14 07:46:03 crc kubenswrapper[4893]: I0314 07:46:03.102668 4893 generic.go:334] "Generic (PLEG): container finished" podID="82a414a7-6d45-4bb0-a625-374719d0482e" containerID="c208fead99477366a558d9bfd2e24e0fc68b80ef714a4b3bbdaf654cb3ceb973" exitCode=0 Mar 14 07:46:03 crc 
kubenswrapper[4893]: I0314 07:46:03.102722 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" event={"ID":"82a414a7-6d45-4bb0-a625-374719d0482e","Type":"ContainerDied","Data":"c208fead99477366a558d9bfd2e24e0fc68b80ef714a4b3bbdaf654cb3ceb973"} Mar 14 07:46:04 crc kubenswrapper[4893]: I0314 07:46:04.521569 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:04 crc kubenswrapper[4893]: I0314 07:46:04.704309 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrvd9\" (UniqueName: \"kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9\") pod \"82a414a7-6d45-4bb0-a625-374719d0482e\" (UID: \"82a414a7-6d45-4bb0-a625-374719d0482e\") " Mar 14 07:46:04 crc kubenswrapper[4893]: I0314 07:46:04.710605 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9" (OuterVolumeSpecName: "kube-api-access-vrvd9") pod "82a414a7-6d45-4bb0-a625-374719d0482e" (UID: "82a414a7-6d45-4bb0-a625-374719d0482e"). InnerVolumeSpecName "kube-api-access-vrvd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:46:04 crc kubenswrapper[4893]: I0314 07:46:04.806217 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrvd9\" (UniqueName: \"kubernetes.io/projected/82a414a7-6d45-4bb0-a625-374719d0482e-kube-api-access-vrvd9\") on node \"crc\" DevicePath \"\"" Mar 14 07:46:05 crc kubenswrapper[4893]: I0314 07:46:05.123828 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" event={"ID":"82a414a7-6d45-4bb0-a625-374719d0482e","Type":"ContainerDied","Data":"0a88e99acabd54e2d2bdab157450806b6605304e1428e910ea4a0c33cd89e444"} Mar 14 07:46:05 crc kubenswrapper[4893]: I0314 07:46:05.124267 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a88e99acabd54e2d2bdab157450806b6605304e1428e910ea4a0c33cd89e444" Mar 14 07:46:05 crc kubenswrapper[4893]: I0314 07:46:05.123869 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-wr4ng" Mar 14 07:46:05 crc kubenswrapper[4893]: I0314 07:46:05.608985 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-d6wgw"] Mar 14 07:46:05 crc kubenswrapper[4893]: I0314 07:46:05.615595 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-d6wgw"] Mar 14 07:46:07 crc kubenswrapper[4893]: I0314 07:46:07.400165 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f23eec9-28e9-4fb9-9bfe-2cdd920df673" path="/var/lib/kubelet/pods/9f23eec9-28e9-4fb9-9bfe-2cdd920df673/volumes" Mar 14 07:46:07 crc kubenswrapper[4893]: I0314 07:46:07.636975 4893 scope.go:117] "RemoveContainer" containerID="188c65ffe61a1f706a576b3f06bf58d8ce14c03b100c4c985656f2757c8131ec" Mar 14 07:46:29 crc kubenswrapper[4893]: I0314 07:46:29.731565 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:46:29 crc kubenswrapper[4893]: I0314 07:46:29.732123 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:46:29 crc kubenswrapper[4893]: I0314 07:46:29.732179 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:46:29 crc kubenswrapper[4893]: I0314 07:46:29.732829 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:46:29 crc kubenswrapper[4893]: I0314 07:46:29.732882 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f" gracePeriod=600 Mar 14 07:46:30 crc kubenswrapper[4893]: I0314 07:46:30.337426 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f" exitCode=0 Mar 14 07:46:30 crc kubenswrapper[4893]: I0314 07:46:30.337513 4893 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f"} Mar 14 07:46:30 crc kubenswrapper[4893]: I0314 07:46:30.337830 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2"} Mar 14 07:46:30 crc kubenswrapper[4893]: I0314 07:46:30.337856 4893 scope.go:117] "RemoveContainer" containerID="b66fce5bdacd92a99f342206409b1a0337114400d80774b71d411c93354c80ab" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.436233 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:46:51 crc kubenswrapper[4893]: E0314 07:46:51.437136 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a414a7-6d45-4bb0-a625-374719d0482e" containerName="oc" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.437153 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a414a7-6d45-4bb0-a625-374719d0482e" containerName="oc" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.437327 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a414a7-6d45-4bb0-a625-374719d0482e" containerName="oc" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.442338 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.457465 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.556879 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.557177 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.557279 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xrs\" (UniqueName: \"kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.658133 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.658200 4893 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.658230 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xrs\" (UniqueName: \"kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.658711 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.658763 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.680379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xrs\" (UniqueName: \"kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs\") pod \"redhat-marketplace-8nmt6\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:51 crc kubenswrapper[4893]: I0314 07:46:51.777468 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:46:52 crc kubenswrapper[4893]: I0314 07:46:52.234446 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:46:52 crc kubenswrapper[4893]: I0314 07:46:52.530299 4893 generic.go:334] "Generic (PLEG): container finished" podID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerID="d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2" exitCode=0 Mar 14 07:46:52 crc kubenswrapper[4893]: I0314 07:46:52.530633 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerDied","Data":"d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2"} Mar 14 07:46:52 crc kubenswrapper[4893]: I0314 07:46:52.530658 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerStarted","Data":"d368d788796ab582c19bc86be53f5c1a7825b70c81bea5d8aa8667a13b7fc1f5"} Mar 14 07:46:54 crc kubenswrapper[4893]: I0314 07:46:54.546811 4893 generic.go:334] "Generic (PLEG): container finished" podID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerID="5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a" exitCode=0 Mar 14 07:46:54 crc kubenswrapper[4893]: I0314 07:46:54.547136 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerDied","Data":"5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a"} Mar 14 07:46:55 crc kubenswrapper[4893]: I0314 07:46:55.561379 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" 
event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerStarted","Data":"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4"} Mar 14 07:46:55 crc kubenswrapper[4893]: I0314 07:46:55.584252 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8nmt6" podStartSLOduration=2.165317926 podStartE2EDuration="4.584030578s" podCreationTimestamp="2026-03-14 07:46:51 +0000 UTC" firstStartedPulling="2026-03-14 07:46:52.532031913 +0000 UTC m=+2891.794208705" lastFinishedPulling="2026-03-14 07:46:54.950744555 +0000 UTC m=+2894.212921357" observedRunningTime="2026-03-14 07:46:55.579992249 +0000 UTC m=+2894.842169041" watchObservedRunningTime="2026-03-14 07:46:55.584030578 +0000 UTC m=+2894.846207360" Mar 14 07:47:01 crc kubenswrapper[4893]: I0314 07:47:01.777861 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:01 crc kubenswrapper[4893]: I0314 07:47:01.778496 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:01 crc kubenswrapper[4893]: I0314 07:47:01.823464 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:02 crc kubenswrapper[4893]: I0314 07:47:02.681678 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:02 crc kubenswrapper[4893]: I0314 07:47:02.727719 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:47:04 crc kubenswrapper[4893]: I0314 07:47:04.648135 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8nmt6" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="registry-server" 
containerID="cri-o://f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4" gracePeriod=2 Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.030059 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.165962 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content\") pod \"c201b70b-6716-4ca0-8e1b-a69eec255486\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.166061 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities\") pod \"c201b70b-6716-4ca0-8e1b-a69eec255486\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.166208 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xrs\" (UniqueName: \"kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs\") pod \"c201b70b-6716-4ca0-8e1b-a69eec255486\" (UID: \"c201b70b-6716-4ca0-8e1b-a69eec255486\") " Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.167094 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities" (OuterVolumeSpecName: "utilities") pod "c201b70b-6716-4ca0-8e1b-a69eec255486" (UID: "c201b70b-6716-4ca0-8e1b-a69eec255486"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.172681 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs" (OuterVolumeSpecName: "kube-api-access-d6xrs") pod "c201b70b-6716-4ca0-8e1b-a69eec255486" (UID: "c201b70b-6716-4ca0-8e1b-a69eec255486"). InnerVolumeSpecName "kube-api-access-d6xrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.195255 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c201b70b-6716-4ca0-8e1b-a69eec255486" (UID: "c201b70b-6716-4ca0-8e1b-a69eec255486"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.269513 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xrs\" (UniqueName: \"kubernetes.io/projected/c201b70b-6716-4ca0-8e1b-a69eec255486-kube-api-access-d6xrs\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.269594 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.269608 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c201b70b-6716-4ca0-8e1b-a69eec255486-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.667986 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 
07:47:05.668314 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="extract-content" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.668332 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="extract-content" Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 07:47:05.668360 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="extract-utilities" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.668367 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="extract-utilities" Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 07:47:05.668394 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="registry-server" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.668403 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="registry-server" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.668654 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerName="registry-server" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.669814 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.670510 4893 generic.go:334] "Generic (PLEG): container finished" podID="c201b70b-6716-4ca0-8e1b-a69eec255486" containerID="f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4" exitCode=0 Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.670552 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerDied","Data":"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4"} Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.670571 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nmt6" event={"ID":"c201b70b-6716-4ca0-8e1b-a69eec255486","Type":"ContainerDied","Data":"d368d788796ab582c19bc86be53f5c1a7825b70c81bea5d8aa8667a13b7fc1f5"} Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.670584 4893 scope.go:117] "RemoveContainer" containerID="f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.670682 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nmt6" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.673994 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfngz\" (UniqueName: \"kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.674045 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.674098 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.675143 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.705301 4893 scope.go:117] "RemoveContainer" containerID="5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.715400 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.720416 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8nmt6"] Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.727158 4893 scope.go:117] "RemoveContainer" containerID="d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.754755 4893 scope.go:117] "RemoveContainer" containerID="f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4" Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 07:47:05.755484 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4\": container with ID starting with f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4 not found: ID does not exist" containerID="f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.755568 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4"} err="failed to get container status \"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4\": rpc error: code = NotFound desc = could not find container \"f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4\": container with ID starting with f61415d4c8cdf9bc5b5fad415cb333d3473ab4e51dbdaa555e434cfe231c55e4 not found: ID does not exist" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.755598 4893 scope.go:117] "RemoveContainer" containerID="5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a" Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 07:47:05.756170 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a\": container with ID starting with 
5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a not found: ID does not exist" containerID="5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.756214 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a"} err="failed to get container status \"5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a\": rpc error: code = NotFound desc = could not find container \"5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a\": container with ID starting with 5b904167016cc11762d2f33b071f5ad288c7ceb861ca9703edb06cc91f5f9a0a not found: ID does not exist" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.756244 4893 scope.go:117] "RemoveContainer" containerID="d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2" Mar 14 07:47:05 crc kubenswrapper[4893]: E0314 07:47:05.756507 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2\": container with ID starting with d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2 not found: ID does not exist" containerID="d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.756687 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2"} err="failed to get container status \"d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2\": rpc error: code = NotFound desc = could not find container \"d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2\": container with ID starting with d717d9865a66625fd130de2346b23a1a0cc6c7062cc75773e1784dc7148d30b2 not found: ID does not 
exist" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.775833 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.775894 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfngz\" (UniqueName: \"kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.775942 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.776371 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.776602 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:05 crc kubenswrapper[4893]: I0314 07:47:05.792107 
4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfngz\" (UniqueName: \"kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz\") pod \"redhat-operators-m9zjw\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.000789 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.265628 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.683230 4893 generic.go:334] "Generic (PLEG): container finished" podID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerID="174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6" exitCode=0 Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.683720 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerDied","Data":"174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6"} Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.683754 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerStarted","Data":"ab45da0e1a384b7d88811573f987ed4df55c7b9aacd3c3d98bc3f0111e2abb04"} Mar 14 07:47:06 crc kubenswrapper[4893]: I0314 07:47:06.685570 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:47:07 crc kubenswrapper[4893]: I0314 07:47:07.385207 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c201b70b-6716-4ca0-8e1b-a69eec255486" 
path="/var/lib/kubelet/pods/c201b70b-6716-4ca0-8e1b-a69eec255486/volumes" Mar 14 07:47:07 crc kubenswrapper[4893]: I0314 07:47:07.694691 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerStarted","Data":"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a"} Mar 14 07:47:08 crc kubenswrapper[4893]: I0314 07:47:08.702786 4893 generic.go:334] "Generic (PLEG): container finished" podID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerID="97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a" exitCode=0 Mar 14 07:47:08 crc kubenswrapper[4893]: I0314 07:47:08.702877 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerDied","Data":"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a"} Mar 14 07:47:09 crc kubenswrapper[4893]: I0314 07:47:09.711905 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerStarted","Data":"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de"} Mar 14 07:47:09 crc kubenswrapper[4893]: I0314 07:47:09.730169 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9zjw" podStartSLOduration=2.325714768 podStartE2EDuration="4.73015183s" podCreationTimestamp="2026-03-14 07:47:05 +0000 UTC" firstStartedPulling="2026-03-14 07:47:06.685250028 +0000 UTC m=+2905.947426820" lastFinishedPulling="2026-03-14 07:47:09.08968709 +0000 UTC m=+2908.351863882" observedRunningTime="2026-03-14 07:47:09.727454164 +0000 UTC m=+2908.989630966" watchObservedRunningTime="2026-03-14 07:47:09.73015183 +0000 UTC m=+2908.992328632" Mar 14 07:47:16 crc kubenswrapper[4893]: I0314 07:47:16.001638 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:16 crc kubenswrapper[4893]: I0314 07:47:16.002325 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:16 crc kubenswrapper[4893]: I0314 07:47:16.051090 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:16 crc kubenswrapper[4893]: I0314 07:47:16.796325 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:16 crc kubenswrapper[4893]: I0314 07:47:16.846322 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:18 crc kubenswrapper[4893]: I0314 07:47:18.785965 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9zjw" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="registry-server" containerID="cri-o://83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de" gracePeriod=2 Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.690507 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.796354 4893 generic.go:334] "Generic (PLEG): container finished" podID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerID="83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de" exitCode=0 Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.796421 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerDied","Data":"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de"} Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.796477 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9zjw" event={"ID":"f1f5f96e-bb94-47fa-a1f4-50f174e7929a","Type":"ContainerDied","Data":"ab45da0e1a384b7d88811573f987ed4df55c7b9aacd3c3d98bc3f0111e2abb04"} Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.796499 4893 scope.go:117] "RemoveContainer" containerID="83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.796719 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9zjw" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.819402 4893 scope.go:117] "RemoveContainer" containerID="97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.837670 4893 scope.go:117] "RemoveContainer" containerID="174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.859556 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfngz\" (UniqueName: \"kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz\") pod \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.859676 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities\") pod \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.859726 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content\") pod \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\" (UID: \"f1f5f96e-bb94-47fa-a1f4-50f174e7929a\") " Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.864847 4893 scope.go:117] "RemoveContainer" containerID="83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de" Mar 14 07:47:19 crc kubenswrapper[4893]: E0314 07:47:19.865411 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de\": container with ID starting with 
83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de not found: ID does not exist" containerID="83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.865465 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de"} err="failed to get container status \"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de\": rpc error: code = NotFound desc = could not find container \"83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de\": container with ID starting with 83e27bbefb767c17c2e27bf9a6982dd45a587519a0f6d09c0bc5582f72cf91de not found: ID does not exist" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.865501 4893 scope.go:117] "RemoveContainer" containerID="97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a" Mar 14 07:47:19 crc kubenswrapper[4893]: E0314 07:47:19.865963 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a\": container with ID starting with 97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a not found: ID does not exist" containerID="97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.866039 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a"} err="failed to get container status \"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a\": rpc error: code = NotFound desc = could not find container \"97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a\": container with ID starting with 97e939de9cdf0716b28adab92c3e8114e88b3c78d9400005a84e7a29f9cfbf1a not found: ID does not 
exist" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.866071 4893 scope.go:117] "RemoveContainer" containerID="174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6" Mar 14 07:47:19 crc kubenswrapper[4893]: E0314 07:47:19.866329 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6\": container with ID starting with 174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6 not found: ID does not exist" containerID="174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.866358 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6"} err="failed to get container status \"174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6\": rpc error: code = NotFound desc = could not find container \"174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6\": container with ID starting with 174676876ca90dd67a231481d4ff7aba4c935d61683283ea0f3b0fefaa63b2b6 not found: ID does not exist" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.872733 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities" (OuterVolumeSpecName: "utilities") pod "f1f5f96e-bb94-47fa-a1f4-50f174e7929a" (UID: "f1f5f96e-bb94-47fa-a1f4-50f174e7929a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.878150 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz" (OuterVolumeSpecName: "kube-api-access-hfngz") pod "f1f5f96e-bb94-47fa-a1f4-50f174e7929a" (UID: "f1f5f96e-bb94-47fa-a1f4-50f174e7929a"). InnerVolumeSpecName "kube-api-access-hfngz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.961556 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfngz\" (UniqueName: \"kubernetes.io/projected/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-kube-api-access-hfngz\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:19 crc kubenswrapper[4893]: I0314 07:47:19.961596 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:20 crc kubenswrapper[4893]: I0314 07:47:20.005507 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1f5f96e-bb94-47fa-a1f4-50f174e7929a" (UID: "f1f5f96e-bb94-47fa-a1f4-50f174e7929a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:47:20 crc kubenswrapper[4893]: I0314 07:47:20.063905 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f5f96e-bb94-47fa-a1f4-50f174e7929a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:47:20 crc kubenswrapper[4893]: I0314 07:47:20.124081 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:20 crc kubenswrapper[4893]: I0314 07:47:20.128602 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9zjw"] Mar 14 07:47:21 crc kubenswrapper[4893]: I0314 07:47:21.390428 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" path="/var/lib/kubelet/pods/f1f5f96e-bb94-47fa-a1f4-50f174e7929a/volumes" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.143624 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557908-r98tz"] Mar 14 07:48:00 crc kubenswrapper[4893]: E0314 07:48:00.144482 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="extract-content" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.144496 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="extract-content" Mar 14 07:48:00 crc kubenswrapper[4893]: E0314 07:48:00.144510 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="extract-utilities" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.144522 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="extract-utilities" Mar 14 07:48:00 crc kubenswrapper[4893]: E0314 07:48:00.144548 4893 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="registry-server" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.144557 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="registry-server" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.144696 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f5f96e-bb94-47fa-a1f4-50f174e7929a" containerName="registry-server" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.145172 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.147680 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.148293 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.148571 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.150860 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-r98tz"] Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.235292 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmxgm\" (UniqueName: \"kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm\") pod \"auto-csr-approver-29557908-r98tz\" (UID: \"d8f5173f-4a2f-40f7-9732-4b1de949fa02\") " pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.336480 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmxgm\" 
(UniqueName: \"kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm\") pod \"auto-csr-approver-29557908-r98tz\" (UID: \"d8f5173f-4a2f-40f7-9732-4b1de949fa02\") " pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.363522 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmxgm\" (UniqueName: \"kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm\") pod \"auto-csr-approver-29557908-r98tz\" (UID: \"d8f5173f-4a2f-40f7-9732-4b1de949fa02\") " pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.462192 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:00 crc kubenswrapper[4893]: I0314 07:48:00.865669 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-r98tz"] Mar 14 07:48:01 crc kubenswrapper[4893]: I0314 07:48:01.088344 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-r98tz" event={"ID":"d8f5173f-4a2f-40f7-9732-4b1de949fa02","Type":"ContainerStarted","Data":"353d1f1d877bbe211b0811c85394fcbdc36853bbba56010fc74da5c1a32882d3"} Mar 14 07:48:02 crc kubenswrapper[4893]: I0314 07:48:02.096484 4893 generic.go:334] "Generic (PLEG): container finished" podID="d8f5173f-4a2f-40f7-9732-4b1de949fa02" containerID="9eeeb353483d1e0337e44327b9d928c1f2157671fb4057af573157e2091cf534" exitCode=0 Mar 14 07:48:02 crc kubenswrapper[4893]: I0314 07:48:02.096820 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-r98tz" event={"ID":"d8f5173f-4a2f-40f7-9732-4b1de949fa02","Type":"ContainerDied","Data":"9eeeb353483d1e0337e44327b9d928c1f2157671fb4057af573157e2091cf534"} Mar 14 07:48:03 crc kubenswrapper[4893]: I0314 07:48:03.407947 4893 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:03 crc kubenswrapper[4893]: I0314 07:48:03.607737 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmxgm\" (UniqueName: \"kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm\") pod \"d8f5173f-4a2f-40f7-9732-4b1de949fa02\" (UID: \"d8f5173f-4a2f-40f7-9732-4b1de949fa02\") " Mar 14 07:48:03 crc kubenswrapper[4893]: I0314 07:48:03.613257 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm" (OuterVolumeSpecName: "kube-api-access-wmxgm") pod "d8f5173f-4a2f-40f7-9732-4b1de949fa02" (UID: "d8f5173f-4a2f-40f7-9732-4b1de949fa02"). InnerVolumeSpecName "kube-api-access-wmxgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:48:03 crc kubenswrapper[4893]: I0314 07:48:03.710348 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmxgm\" (UniqueName: \"kubernetes.io/projected/d8f5173f-4a2f-40f7-9732-4b1de949fa02-kube-api-access-wmxgm\") on node \"crc\" DevicePath \"\"" Mar 14 07:48:04 crc kubenswrapper[4893]: I0314 07:48:04.114945 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-r98tz" event={"ID":"d8f5173f-4a2f-40f7-9732-4b1de949fa02","Type":"ContainerDied","Data":"353d1f1d877bbe211b0811c85394fcbdc36853bbba56010fc74da5c1a32882d3"} Mar 14 07:48:04 crc kubenswrapper[4893]: I0314 07:48:04.115002 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353d1f1d877bbe211b0811c85394fcbdc36853bbba56010fc74da5c1a32882d3" Mar 14 07:48:04 crc kubenswrapper[4893]: I0314 07:48:04.115010 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-r98tz" Mar 14 07:48:04 crc kubenswrapper[4893]: I0314 07:48:04.486869 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-bnbmw"] Mar 14 07:48:04 crc kubenswrapper[4893]: I0314 07:48:04.493870 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-bnbmw"] Mar 14 07:48:05 crc kubenswrapper[4893]: I0314 07:48:05.387018 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cb9ac3-d81d-481c-be21-914866d0d3d8" path="/var/lib/kubelet/pods/a7cb9ac3-d81d-481c-be21-914866d0d3d8/volumes" Mar 14 07:48:07 crc kubenswrapper[4893]: I0314 07:48:07.753397 4893 scope.go:117] "RemoveContainer" containerID="c3f3a9afceb13e93fb754ab04f95b7f6bd1c323074479894b9e7909f1a705c75" Mar 14 07:48:59 crc kubenswrapper[4893]: I0314 07:48:59.731758 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:48:59 crc kubenswrapper[4893]: I0314 07:48:59.732491 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:49:29 crc kubenswrapper[4893]: I0314 07:49:29.731102 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:49:29 crc kubenswrapper[4893]: 
I0314 07:49:29.732199 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.731260 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.732099 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.732157 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.732707 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.732776 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" containerID="cri-o://7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" gracePeriod=600 Mar 14 07:49:59 crc kubenswrapper[4893]: E0314 07:49:59.875831 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.955860 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" exitCode=0 Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.955900 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2"} Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.955935 4893 scope.go:117] "RemoveContainer" containerID="16139de9ab1a03adaba97c225dd8a3825dfe9dc665a321422feff65835b9630f" Mar 14 07:49:59 crc kubenswrapper[4893]: I0314 07:49:59.956441 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:49:59 crc kubenswrapper[4893]: E0314 07:49:59.956678 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.139179 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557910-grrsl"] Mar 14 07:50:00 crc kubenswrapper[4893]: E0314 07:50:00.139455 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f5173f-4a2f-40f7-9732-4b1de949fa02" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.139468 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f5173f-4a2f-40f7-9732-4b1de949fa02" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.139641 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f5173f-4a2f-40f7-9732-4b1de949fa02" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.140069 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.147157 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.147383 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.149482 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-grrsl"] Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.152801 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.236407 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h45z7\" (UniqueName: 
\"kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7\") pod \"auto-csr-approver-29557910-grrsl\" (UID: \"7665cdc0-87da-4725-970c-a7c3a61f0363\") " pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.338991 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h45z7\" (UniqueName: \"kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7\") pod \"auto-csr-approver-29557910-grrsl\" (UID: \"7665cdc0-87da-4725-970c-a7c3a61f0363\") " pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.369201 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h45z7\" (UniqueName: \"kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7\") pod \"auto-csr-approver-29557910-grrsl\" (UID: \"7665cdc0-87da-4725-970c-a7c3a61f0363\") " pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.459626 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.872413 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-grrsl"] Mar 14 07:50:00 crc kubenswrapper[4893]: I0314 07:50:00.966544 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-grrsl" event={"ID":"7665cdc0-87da-4725-970c-a7c3a61f0363","Type":"ContainerStarted","Data":"678bbd8bdd401e9a792b9553b6ead9b38a5c1bcffad79fad6fc6b0f02e79728c"} Mar 14 07:50:02 crc kubenswrapper[4893]: I0314 07:50:02.984943 4893 generic.go:334] "Generic (PLEG): container finished" podID="7665cdc0-87da-4725-970c-a7c3a61f0363" containerID="1f1d47a9a1f75643124c5cf82b9dba01964e9173ba78790670612495e6b4e585" exitCode=0 Mar 14 07:50:02 crc kubenswrapper[4893]: I0314 07:50:02.985022 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-grrsl" event={"ID":"7665cdc0-87da-4725-970c-a7c3a61f0363","Type":"ContainerDied","Data":"1f1d47a9a1f75643124c5cf82b9dba01964e9173ba78790670612495e6b4e585"} Mar 14 07:50:04 crc kubenswrapper[4893]: I0314 07:50:04.285699 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:04 crc kubenswrapper[4893]: I0314 07:50:04.289748 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h45z7\" (UniqueName: \"kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7\") pod \"7665cdc0-87da-4725-970c-a7c3a61f0363\" (UID: \"7665cdc0-87da-4725-970c-a7c3a61f0363\") " Mar 14 07:50:04 crc kubenswrapper[4893]: I0314 07:50:04.295917 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7" (OuterVolumeSpecName: "kube-api-access-h45z7") pod "7665cdc0-87da-4725-970c-a7c3a61f0363" (UID: "7665cdc0-87da-4725-970c-a7c3a61f0363"). InnerVolumeSpecName "kube-api-access-h45z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:50:04 crc kubenswrapper[4893]: I0314 07:50:04.390890 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h45z7\" (UniqueName: \"kubernetes.io/projected/7665cdc0-87da-4725-970c-a7c3a61f0363-kube-api-access-h45z7\") on node \"crc\" DevicePath \"\"" Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.005831 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-grrsl" event={"ID":"7665cdc0-87da-4725-970c-a7c3a61f0363","Type":"ContainerDied","Data":"678bbd8bdd401e9a792b9553b6ead9b38a5c1bcffad79fad6fc6b0f02e79728c"} Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.005872 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="678bbd8bdd401e9a792b9553b6ead9b38a5c1bcffad79fad6fc6b0f02e79728c" Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.005902 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-grrsl" Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.344981 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-4z2xz"] Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.351384 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-4z2xz"] Mar 14 07:50:05 crc kubenswrapper[4893]: I0314 07:50:05.385007 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36dcf765-b9cd-40a4-8794-40dd081e5d91" path="/var/lib/kubelet/pods/36dcf765-b9cd-40a4-8794-40dd081e5d91/volumes" Mar 14 07:50:07 crc kubenswrapper[4893]: I0314 07:50:07.835950 4893 scope.go:117] "RemoveContainer" containerID="d8e2b21cc83acc02487212d8644c920f9c242450d7179f285949fe5868385b5c" Mar 14 07:50:11 crc kubenswrapper[4893]: I0314 07:50:11.381292 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:50:11 crc kubenswrapper[4893]: E0314 07:50:11.382107 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:50:25 crc kubenswrapper[4893]: I0314 07:50:25.376751 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:50:25 crc kubenswrapper[4893]: E0314 07:50:25.377419 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:50:36 crc kubenswrapper[4893]: I0314 07:50:36.377030 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:50:36 crc kubenswrapper[4893]: E0314 07:50:36.377938 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:50:49 crc kubenswrapper[4893]: I0314 07:50:49.376708 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:50:49 crc kubenswrapper[4893]: E0314 07:50:49.377538 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:51:01 crc kubenswrapper[4893]: I0314 07:51:01.380180 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:51:01 crc kubenswrapper[4893]: E0314 07:51:01.380916 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:51:15 crc kubenswrapper[4893]: I0314 07:51:15.376935 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:51:15 crc kubenswrapper[4893]: E0314 07:51:15.377663 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:51:26 crc kubenswrapper[4893]: I0314 07:51:26.376559 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:51:26 crc kubenswrapper[4893]: E0314 07:51:26.377595 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:51:38 crc kubenswrapper[4893]: I0314 07:51:38.377158 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:51:38 crc kubenswrapper[4893]: E0314 07:51:38.377972 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:51:50 crc kubenswrapper[4893]: I0314 07:51:50.376089 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:51:50 crc kubenswrapper[4893]: E0314 07:51:50.376794 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.177649 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557912-9vhtd"] Mar 14 07:52:00 crc kubenswrapper[4893]: E0314 07:52:00.179205 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7665cdc0-87da-4725-970c-a7c3a61f0363" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.179225 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7665cdc0-87da-4725-970c-a7c3a61f0363" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.183190 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="7665cdc0-87da-4725-970c-a7c3a61f0363" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.184281 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-9vhtd"] Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.184382 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.187507 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.187979 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.191362 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.293110 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kx5h\" (UniqueName: \"kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h\") pod \"auto-csr-approver-29557912-9vhtd\" (UID: \"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d\") " pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.394200 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kx5h\" (UniqueName: \"kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h\") pod \"auto-csr-approver-29557912-9vhtd\" (UID: \"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d\") " pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.413786 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kx5h\" (UniqueName: \"kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h\") pod \"auto-csr-approver-29557912-9vhtd\" (UID: \"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d\") " pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.518054 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:00 crc kubenswrapper[4893]: I0314 07:52:00.926792 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-9vhtd"] Mar 14 07:52:01 crc kubenswrapper[4893]: I0314 07:52:01.835217 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" event={"ID":"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d","Type":"ContainerStarted","Data":"0eb4705ae3185dcea4437b2dc5cf5baea6d561fca9997f66d44953fb1188e6a6"} Mar 14 07:52:02 crc kubenswrapper[4893]: I0314 07:52:02.843209 4893 generic.go:334] "Generic (PLEG): container finished" podID="63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" containerID="6392e154b8552643b629c7840e16b54867cfc0ce6f6256f55e0a98f48ce51b56" exitCode=0 Mar 14 07:52:02 crc kubenswrapper[4893]: I0314 07:52:02.843327 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" event={"ID":"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d","Type":"ContainerDied","Data":"6392e154b8552643b629c7840e16b54867cfc0ce6f6256f55e0a98f48ce51b56"} Mar 14 07:52:03 crc kubenswrapper[4893]: I0314 07:52:03.377487 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:52:03 crc kubenswrapper[4893]: E0314 07:52:03.377753 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.102745 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.245028 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kx5h\" (UniqueName: \"kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h\") pod \"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d\" (UID: \"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d\") " Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.256859 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h" (OuterVolumeSpecName: "kube-api-access-7kx5h") pod "63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" (UID: "63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d"). InnerVolumeSpecName "kube-api-access-7kx5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.346650 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kx5h\" (UniqueName: \"kubernetes.io/projected/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d-kube-api-access-7kx5h\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.856999 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" event={"ID":"63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d","Type":"ContainerDied","Data":"0eb4705ae3185dcea4437b2dc5cf5baea6d561fca9997f66d44953fb1188e6a6"} Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.857039 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb4705ae3185dcea4437b2dc5cf5baea6d561fca9997f66d44953fb1188e6a6" Mar 14 07:52:04 crc kubenswrapper[4893]: I0314 07:52:04.857115 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-9vhtd" Mar 14 07:52:05 crc kubenswrapper[4893]: I0314 07:52:05.156705 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-wr4ng"] Mar 14 07:52:05 crc kubenswrapper[4893]: I0314 07:52:05.162056 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-wr4ng"] Mar 14 07:52:05 crc kubenswrapper[4893]: I0314 07:52:05.385621 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a414a7-6d45-4bb0-a625-374719d0482e" path="/var/lib/kubelet/pods/82a414a7-6d45-4bb0-a625-374719d0482e/volumes" Mar 14 07:52:07 crc kubenswrapper[4893]: I0314 07:52:07.922662 4893 scope.go:117] "RemoveContainer" containerID="c208fead99477366a558d9bfd2e24e0fc68b80ef714a4b3bbdaf654cb3ceb973" Mar 14 07:52:16 crc kubenswrapper[4893]: I0314 07:52:16.376371 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:52:16 crc kubenswrapper[4893]: E0314 07:52:16.377292 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:28 crc kubenswrapper[4893]: I0314 07:52:28.376734 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:52:28 crc kubenswrapper[4893]: E0314 07:52:28.377581 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.471745 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:38 crc kubenswrapper[4893]: E0314 07:52:38.474837 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" containerName="oc" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.475072 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" containerName="oc" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.475607 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" containerName="oc" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.478256 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.479997 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.525697 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srsx6\" (UniqueName: \"kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.525781 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.525818 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.626844 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srsx6\" (UniqueName: \"kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.626925 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.626970 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.627563 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.627635 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.654458 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srsx6\" (UniqueName: \"kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6\") pod \"certified-operators-zd2dn\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:38 crc kubenswrapper[4893]: I0314 07:52:38.832359 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:39 crc kubenswrapper[4893]: I0314 07:52:39.279654 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:39 crc kubenswrapper[4893]: I0314 07:52:39.376649 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:52:39 crc kubenswrapper[4893]: E0314 07:52:39.376905 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.125568 4893 generic.go:334] "Generic (PLEG): container finished" podID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerID="b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521" exitCode=0 Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.125615 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerDied","Data":"b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521"} Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.125882 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerStarted","Data":"74f2fdba170318d8a76d3b00d86c76e52128d893f07ccd2bd45a53cc738682a3"} Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.127361 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 
07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.875127 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.876882 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:40 crc kubenswrapper[4893]: I0314 07:52:40.942194 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.058178 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.058244 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.058292 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9fc\" (UniqueName: \"kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.149777 4893 generic.go:334] "Generic (PLEG): container finished" podID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" 
containerID="26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75" exitCode=0 Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.149834 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerDied","Data":"26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75"} Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.158998 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9fc\" (UniqueName: \"kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.159236 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.159298 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.162367 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc 
kubenswrapper[4893]: I0314 07:52:41.162840 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.194507 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9fc\" (UniqueName: \"kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc\") pod \"community-operators-28kmr\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:41 crc kubenswrapper[4893]: I0314 07:52:41.493408 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:41.751107 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:42.157832 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerStarted","Data":"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7"} Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:42.159039 4893 generic.go:334] "Generic (PLEG): container finished" podID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerID="1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57" exitCode=0 Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:42.159079 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" 
event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerDied","Data":"1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57"} Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:42.159103 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerStarted","Data":"b96cfdf3938fe44921679355ceea1e70eace577e01d87b565d104675c4add143"} Mar 14 07:52:42 crc kubenswrapper[4893]: I0314 07:52:42.180876 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zd2dn" podStartSLOduration=2.778562372 podStartE2EDuration="4.180844261s" podCreationTimestamp="2026-03-14 07:52:38 +0000 UTC" firstStartedPulling="2026-03-14 07:52:40.127171869 +0000 UTC m=+3239.389348661" lastFinishedPulling="2026-03-14 07:52:41.529453758 +0000 UTC m=+3240.791630550" observedRunningTime="2026-03-14 07:52:42.177282386 +0000 UTC m=+3241.439459188" watchObservedRunningTime="2026-03-14 07:52:42.180844261 +0000 UTC m=+3241.443021053" Mar 14 07:52:43 crc kubenswrapper[4893]: I0314 07:52:43.168447 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerStarted","Data":"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665"} Mar 14 07:52:44 crc kubenswrapper[4893]: I0314 07:52:44.190945 4893 generic.go:334] "Generic (PLEG): container finished" podID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerID="031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665" exitCode=0 Mar 14 07:52:44 crc kubenswrapper[4893]: I0314 07:52:44.191079 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" 
event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerDied","Data":"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665"} Mar 14 07:52:45 crc kubenswrapper[4893]: I0314 07:52:45.202063 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerStarted","Data":"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc"} Mar 14 07:52:45 crc kubenswrapper[4893]: I0314 07:52:45.247591 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-28kmr" podStartSLOduration=2.778781964 podStartE2EDuration="5.247507688s" podCreationTimestamp="2026-03-14 07:52:40 +0000 UTC" firstStartedPulling="2026-03-14 07:52:42.160212587 +0000 UTC m=+3241.422389379" lastFinishedPulling="2026-03-14 07:52:44.628938301 +0000 UTC m=+3243.891115103" observedRunningTime="2026-03-14 07:52:45.241913054 +0000 UTC m=+3244.504089866" watchObservedRunningTime="2026-03-14 07:52:45.247507688 +0000 UTC m=+3244.509684490" Mar 14 07:52:48 crc kubenswrapper[4893]: I0314 07:52:48.833210 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:48 crc kubenswrapper[4893]: I0314 07:52:48.834377 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:48 crc kubenswrapper[4893]: I0314 07:52:48.901928 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:49 crc kubenswrapper[4893]: I0314 07:52:49.272744 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:49 crc kubenswrapper[4893]: I0314 07:52:49.318541 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:50 crc kubenswrapper[4893]: I0314 07:52:50.376923 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:52:50 crc kubenswrapper[4893]: E0314 07:52:50.377417 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:52:51 crc kubenswrapper[4893]: I0314 07:52:51.251702 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zd2dn" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="registry-server" containerID="cri-o://bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7" gracePeriod=2 Mar 14 07:52:51 crc kubenswrapper[4893]: I0314 07:52:51.494034 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:51 crc kubenswrapper[4893]: I0314 07:52:51.494386 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:51 crc kubenswrapper[4893]: I0314 07:52:51.535584 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.177083 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.260351 4893 generic.go:334] "Generic (PLEG): container finished" podID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerID="bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7" exitCode=0 Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.260405 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zd2dn" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.260433 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerDied","Data":"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7"} Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.260504 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd2dn" event={"ID":"b4f7f615-c2b4-4c77-8032-c93e7070ccf8","Type":"ContainerDied","Data":"74f2fdba170318d8a76d3b00d86c76e52128d893f07ccd2bd45a53cc738682a3"} Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.260539 4893 scope.go:117] "RemoveContainer" containerID="bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.275116 4893 scope.go:117] "RemoveContainer" containerID="26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.293720 4893 scope.go:117] "RemoveContainer" containerID="b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.316822 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.317166 4893 scope.go:117] 
"RemoveContainer" containerID="bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7" Mar 14 07:52:52 crc kubenswrapper[4893]: E0314 07:52:52.317500 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7\": container with ID starting with bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7 not found: ID does not exist" containerID="bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.317546 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7"} err="failed to get container status \"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7\": rpc error: code = NotFound desc = could not find container \"bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7\": container with ID starting with bd7bd3e05acc487ed63973d202afd152c30b4a8df4bba7419d0b172aba5d06a7 not found: ID does not exist" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.317568 4893 scope.go:117] "RemoveContainer" containerID="26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75" Mar 14 07:52:52 crc kubenswrapper[4893]: E0314 07:52:52.317811 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75\": container with ID starting with 26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75 not found: ID does not exist" containerID="26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.317834 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75"} err="failed to get container status \"26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75\": rpc error: code = NotFound desc = could not find container \"26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75\": container with ID starting with 26667422a675b1a051ac8520da8ee05c3a629ad9a67e6a9e403c005cc85f8c75 not found: ID does not exist" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.317848 4893 scope.go:117] "RemoveContainer" containerID="b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521" Mar 14 07:52:52 crc kubenswrapper[4893]: E0314 07:52:52.318059 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521\": container with ID starting with b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521 not found: ID does not exist" containerID="b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.318080 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521"} err="failed to get container status \"b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521\": rpc error: code = NotFound desc = could not find container \"b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521\": container with ID starting with b16e6d69cdc848702c51742fd7699553b113f92ae33703d6a84cef5d3850b521 not found: ID does not exist" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.346927 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities\") pod \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\" (UID: 
\"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.347041 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content\") pod \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.347132 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srsx6\" (UniqueName: \"kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6\") pod \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\" (UID: \"b4f7f615-c2b4-4c77-8032-c93e7070ccf8\") " Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.348106 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities" (OuterVolumeSpecName: "utilities") pod "b4f7f615-c2b4-4c77-8032-c93e7070ccf8" (UID: "b4f7f615-c2b4-4c77-8032-c93e7070ccf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.352502 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6" (OuterVolumeSpecName: "kube-api-access-srsx6") pod "b4f7f615-c2b4-4c77-8032-c93e7070ccf8" (UID: "b4f7f615-c2b4-4c77-8032-c93e7070ccf8"). InnerVolumeSpecName "kube-api-access-srsx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.419411 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4f7f615-c2b4-4c77-8032-c93e7070ccf8" (UID: "b4f7f615-c2b4-4c77-8032-c93e7070ccf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.449759 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srsx6\" (UniqueName: \"kubernetes.io/projected/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-kube-api-access-srsx6\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.449798 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.449815 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f7f615-c2b4-4c77-8032-c93e7070ccf8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.597056 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:52 crc kubenswrapper[4893]: I0314 07:52:52.605346 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zd2dn"] Mar 14 07:52:53 crc kubenswrapper[4893]: I0314 07:52:53.390865 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" path="/var/lib/kubelet/pods/b4f7f615-c2b4-4c77-8032-c93e7070ccf8/volumes" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.346293 4893 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.346595 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-28kmr" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="registry-server" containerID="cri-o://43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc" gracePeriod=2 Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.745322 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.806670 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk9fc\" (UniqueName: \"kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc\") pod \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.806742 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities\") pod \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.806821 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content\") pod \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\" (UID: \"782509e8-7ca5-4dc5-a8ff-e909e25a22e5\") " Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.807927 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities" (OuterVolumeSpecName: "utilities") pod 
"782509e8-7ca5-4dc5-a8ff-e909e25a22e5" (UID: "782509e8-7ca5-4dc5-a8ff-e909e25a22e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.814280 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc" (OuterVolumeSpecName: "kube-api-access-gk9fc") pod "782509e8-7ca5-4dc5-a8ff-e909e25a22e5" (UID: "782509e8-7ca5-4dc5-a8ff-e909e25a22e5"). InnerVolumeSpecName "kube-api-access-gk9fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.861743 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "782509e8-7ca5-4dc5-a8ff-e909e25a22e5" (UID: "782509e8-7ca5-4dc5-a8ff-e909e25a22e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.909004 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk9fc\" (UniqueName: \"kubernetes.io/projected/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-kube-api-access-gk9fc\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.909067 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:54 crc kubenswrapper[4893]: I0314 07:52:54.909088 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782509e8-7ca5-4dc5-a8ff-e909e25a22e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.292141 4893 generic.go:334] "Generic (PLEG): container finished" podID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerID="43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc" exitCode=0 Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.292213 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerDied","Data":"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc"} Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.292231 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28kmr" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.292270 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28kmr" event={"ID":"782509e8-7ca5-4dc5-a8ff-e909e25a22e5","Type":"ContainerDied","Data":"b96cfdf3938fe44921679355ceea1e70eace577e01d87b565d104675c4add143"} Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.292304 4893 scope.go:117] "RemoveContainer" containerID="43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.313969 4893 scope.go:117] "RemoveContainer" containerID="031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.333198 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.345239 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-28kmr"] Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.351380 4893 scope.go:117] "RemoveContainer" containerID="1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.375059 4893 scope.go:117] "RemoveContainer" containerID="43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc" Mar 14 07:52:55 crc kubenswrapper[4893]: E0314 07:52:55.375570 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc\": container with ID starting with 43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc not found: ID does not exist" containerID="43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.375613 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc"} err="failed to get container status \"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc\": rpc error: code = NotFound desc = could not find container \"43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc\": container with ID starting with 43b4b2b25cb867f9ddabbc2246f9bbbdaed06ebb25722593e29754a44ad8cfdc not found: ID does not exist" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.375643 4893 scope.go:117] "RemoveContainer" containerID="031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665" Mar 14 07:52:55 crc kubenswrapper[4893]: E0314 07:52:55.376164 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665\": container with ID starting with 031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665 not found: ID does not exist" containerID="031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.376191 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665"} err="failed to get container status \"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665\": rpc error: code = NotFound desc = could not find container \"031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665\": container with ID starting with 031dbeaeb84359cf7332c6e62fd161f890b55664cf7f3b301355f7d756f64665 not found: ID does not exist" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.376235 4893 scope.go:117] "RemoveContainer" containerID="1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57" Mar 14 07:52:55 crc kubenswrapper[4893]: E0314 
07:52:55.376665 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57\": container with ID starting with 1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57 not found: ID does not exist" containerID="1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.376740 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57"} err="failed to get container status \"1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57\": rpc error: code = NotFound desc = could not find container \"1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57\": container with ID starting with 1008681b7ef98dd7cc85ef330f9cb1a711261049c0ba96df4612fc68ae64af57 not found: ID does not exist" Mar 14 07:52:55 crc kubenswrapper[4893]: I0314 07:52:55.386130 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" path="/var/lib/kubelet/pods/782509e8-7ca5-4dc5-a8ff-e909e25a22e5/volumes" Mar 14 07:53:04 crc kubenswrapper[4893]: I0314 07:53:04.376927 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:53:04 crc kubenswrapper[4893]: E0314 07:53:04.377863 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:53:16 crc kubenswrapper[4893]: I0314 07:53:16.377254 
4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:53:16 crc kubenswrapper[4893]: E0314 07:53:16.378392 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:53:28 crc kubenswrapper[4893]: I0314 07:53:28.377233 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:53:28 crc kubenswrapper[4893]: E0314 07:53:28.378273 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:53:40 crc kubenswrapper[4893]: I0314 07:53:40.377317 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:53:40 crc kubenswrapper[4893]: E0314 07:53:40.378088 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:53:53 crc kubenswrapper[4893]: I0314 
07:53:53.376372 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:53:53 crc kubenswrapper[4893]: E0314 07:53:53.377248 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.142254 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557914-dnfgf"] Mar 14 07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143006 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="extract-utilities" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143239 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="extract-utilities" Mar 14 07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143261 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143271 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143283 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="extract-content" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143291 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="extract-content" Mar 14 
07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143302 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143309 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143325 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="extract-utilities" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143332 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="extract-utilities" Mar 14 07:54:00 crc kubenswrapper[4893]: E0314 07:54:00.143349 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="extract-content" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143357 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="extract-content" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143577 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="782509e8-7ca5-4dc5-a8ff-e909e25a22e5" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.143610 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f7f615-c2b4-4c77-8032-c93e7070ccf8" containerName="registry-server" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.144077 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.146220 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.146450 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.147187 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.156578 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-dnfgf"] Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.211244 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96jf\" (UniqueName: \"kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf\") pod \"auto-csr-approver-29557914-dnfgf\" (UID: \"e6df7899-9f4b-4121-8025-9290657a6bd3\") " pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.312015 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96jf\" (UniqueName: \"kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf\") pod \"auto-csr-approver-29557914-dnfgf\" (UID: \"e6df7899-9f4b-4121-8025-9290657a6bd3\") " pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.329135 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96jf\" (UniqueName: \"kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf\") pod \"auto-csr-approver-29557914-dnfgf\" (UID: \"e6df7899-9f4b-4121-8025-9290657a6bd3\") " 
pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.472976 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:00 crc kubenswrapper[4893]: I0314 07:54:00.885493 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-dnfgf"] Mar 14 07:54:01 crc kubenswrapper[4893]: I0314 07:54:01.789980 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" event={"ID":"e6df7899-9f4b-4121-8025-9290657a6bd3","Type":"ContainerStarted","Data":"314907893c42599968ce80ce3cdb5dc40e40870f2ad70051b56b490fa6a40dfb"} Mar 14 07:54:02 crc kubenswrapper[4893]: I0314 07:54:02.797870 4893 generic.go:334] "Generic (PLEG): container finished" podID="e6df7899-9f4b-4121-8025-9290657a6bd3" containerID="800ac63c5921614b16f269925d4144d36ef0e0f9e0e588031b796af66f99317f" exitCode=0 Mar 14 07:54:02 crc kubenswrapper[4893]: I0314 07:54:02.797924 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" event={"ID":"e6df7899-9f4b-4121-8025-9290657a6bd3","Type":"ContainerDied","Data":"800ac63c5921614b16f269925d4144d36ef0e0f9e0e588031b796af66f99317f"} Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.056192 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.164967 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j96jf\" (UniqueName: \"kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf\") pod \"e6df7899-9f4b-4121-8025-9290657a6bd3\" (UID: \"e6df7899-9f4b-4121-8025-9290657a6bd3\") " Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.171638 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf" (OuterVolumeSpecName: "kube-api-access-j96jf") pod "e6df7899-9f4b-4121-8025-9290657a6bd3" (UID: "e6df7899-9f4b-4121-8025-9290657a6bd3"). InnerVolumeSpecName "kube-api-access-j96jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.266060 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j96jf\" (UniqueName: \"kubernetes.io/projected/e6df7899-9f4b-4121-8025-9290657a6bd3-kube-api-access-j96jf\") on node \"crc\" DevicePath \"\"" Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.812919 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" event={"ID":"e6df7899-9f4b-4121-8025-9290657a6bd3","Type":"ContainerDied","Data":"314907893c42599968ce80ce3cdb5dc40e40870f2ad70051b56b490fa6a40dfb"} Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.812956 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314907893c42599968ce80ce3cdb5dc40e40870f2ad70051b56b490fa6a40dfb" Mar 14 07:54:04 crc kubenswrapper[4893]: I0314 07:54:04.812982 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-dnfgf" Mar 14 07:54:05 crc kubenswrapper[4893]: I0314 07:54:05.142738 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-r98tz"] Mar 14 07:54:05 crc kubenswrapper[4893]: I0314 07:54:05.147618 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-r98tz"] Mar 14 07:54:05 crc kubenswrapper[4893]: I0314 07:54:05.377263 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:54:05 crc kubenswrapper[4893]: E0314 07:54:05.377866 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:54:05 crc kubenswrapper[4893]: I0314 07:54:05.384333 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f5173f-4a2f-40f7-9732-4b1de949fa02" path="/var/lib/kubelet/pods/d8f5173f-4a2f-40f7-9732-4b1de949fa02/volumes" Mar 14 07:54:08 crc kubenswrapper[4893]: I0314 07:54:08.036074 4893 scope.go:117] "RemoveContainer" containerID="9eeeb353483d1e0337e44327b9d928c1f2157671fb4057af573157e2091cf534" Mar 14 07:54:18 crc kubenswrapper[4893]: I0314 07:54:18.377220 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:54:18 crc kubenswrapper[4893]: E0314 07:54:18.377970 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:54:33 crc kubenswrapper[4893]: I0314 07:54:33.376314 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:54:33 crc kubenswrapper[4893]: E0314 07:54:33.377106 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:54:44 crc kubenswrapper[4893]: I0314 07:54:44.376913 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:54:44 crc kubenswrapper[4893]: E0314 07:54:44.377865 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:54:58 crc kubenswrapper[4893]: I0314 07:54:58.377051 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:54:58 crc kubenswrapper[4893]: E0314 07:54:58.378182 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 07:55:09 crc kubenswrapper[4893]: I0314 07:55:09.377017 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2" Mar 14 07:55:10 crc kubenswrapper[4893]: I0314 07:55:10.261190 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e"} Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.151159 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557916-vrrd2"] Mar 14 07:56:00 crc kubenswrapper[4893]: E0314 07:56:00.152415 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6df7899-9f4b-4121-8025-9290657a6bd3" containerName="oc" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.152433 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6df7899-9f4b-4121-8025-9290657a6bd3" containerName="oc" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.152644 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6df7899-9f4b-4121-8025-9290657a6bd3" containerName="oc" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.153257 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.160681 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.160913 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.160949 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.162851 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-vrrd2"] Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.234288 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhn6n\" (UniqueName: \"kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n\") pod \"auto-csr-approver-29557916-vrrd2\" (UID: \"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c\") " pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.335617 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhn6n\" (UniqueName: \"kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n\") pod \"auto-csr-approver-29557916-vrrd2\" (UID: \"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c\") " pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.363334 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhn6n\" (UniqueName: \"kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n\") pod \"auto-csr-approver-29557916-vrrd2\" (UID: \"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c\") " 
pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.473653 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:00 crc kubenswrapper[4893]: I0314 07:56:00.904047 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-vrrd2"] Mar 14 07:56:01 crc kubenswrapper[4893]: I0314 07:56:01.644461 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" event={"ID":"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c","Type":"ContainerStarted","Data":"99a817b44ca0fa8aa50b27b620212de86e9860474d3a455feb376860505e195d"} Mar 14 07:56:02 crc kubenswrapper[4893]: I0314 07:56:02.654176 4893 generic.go:334] "Generic (PLEG): container finished" podID="d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" containerID="dc8fcd96f1d56cc44be15d9af74681fe2a4ad8c84f52bb28d81cc79d0c576c2d" exitCode=0 Mar 14 07:56:02 crc kubenswrapper[4893]: I0314 07:56:02.654228 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" event={"ID":"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c","Type":"ContainerDied","Data":"dc8fcd96f1d56cc44be15d9af74681fe2a4ad8c84f52bb28d81cc79d0c576c2d"} Mar 14 07:56:03 crc kubenswrapper[4893]: I0314 07:56:03.966683 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.085063 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhn6n\" (UniqueName: \"kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n\") pod \"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c\" (UID: \"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c\") " Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.090091 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n" (OuterVolumeSpecName: "kube-api-access-lhn6n") pod "d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" (UID: "d9cabd36-ad52-4161-94a7-aaa72b6eeb9c"). InnerVolumeSpecName "kube-api-access-lhn6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.186712 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhn6n\" (UniqueName: \"kubernetes.io/projected/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c-kube-api-access-lhn6n\") on node \"crc\" DevicePath \"\"" Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.679653 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" event={"ID":"d9cabd36-ad52-4161-94a7-aaa72b6eeb9c","Type":"ContainerDied","Data":"99a817b44ca0fa8aa50b27b620212de86e9860474d3a455feb376860505e195d"} Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.679694 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a817b44ca0fa8aa50b27b620212de86e9860474d3a455feb376860505e195d" Mar 14 07:56:04 crc kubenswrapper[4893]: I0314 07:56:04.679754 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-vrrd2" Mar 14 07:56:05 crc kubenswrapper[4893]: I0314 07:56:05.063014 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-grrsl"] Mar 14 07:56:05 crc kubenswrapper[4893]: I0314 07:56:05.070566 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-grrsl"] Mar 14 07:56:05 crc kubenswrapper[4893]: I0314 07:56:05.385153 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7665cdc0-87da-4725-970c-a7c3a61f0363" path="/var/lib/kubelet/pods/7665cdc0-87da-4725-970c-a7c3a61f0363/volumes" Mar 14 07:56:08 crc kubenswrapper[4893]: I0314 07:56:08.137878 4893 scope.go:117] "RemoveContainer" containerID="1f1d47a9a1f75643124c5cf82b9dba01964e9173ba78790670612495e6b4e585" Mar 14 07:57:29 crc kubenswrapper[4893]: I0314 07:57:29.731500 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:57:29 crc kubenswrapper[4893]: I0314 07:57:29.732205 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.082823 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"] Mar 14 07:57:34 crc kubenswrapper[4893]: E0314 07:57:34.083423 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" containerName="oc" Mar 14 07:57:34 crc 
kubenswrapper[4893]: I0314 07:57:34.083434 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" containerName="oc" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.083583 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" containerName="oc" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.084456 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.095703 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"] Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.250986 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rr56\" (UniqueName: \"kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.251057 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.251293 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 
07:57:34.352138 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.352213 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.352277 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rr56\" (UniqueName: \"kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.352739 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.352883 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.373155 4893 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7rr56\" (UniqueName: \"kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56\") pod \"redhat-operators-hmqms\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.413815 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:34 crc kubenswrapper[4893]: I0314 07:57:34.652227 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"] Mar 14 07:57:35 crc kubenswrapper[4893]: I0314 07:57:35.306281 4893 generic.go:334] "Generic (PLEG): container finished" podID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerID="2c7c66c2266531e7dbd254ac85a6bf8c4cd7e26a59eb6e46c102471fe8f95d08" exitCode=0 Mar 14 07:57:35 crc kubenswrapper[4893]: I0314 07:57:35.306389 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerDied","Data":"2c7c66c2266531e7dbd254ac85a6bf8c4cd7e26a59eb6e46c102471fe8f95d08"} Mar 14 07:57:35 crc kubenswrapper[4893]: I0314 07:57:35.306885 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerStarted","Data":"8e8bee8a17658c43d82c9b43d345931a97bb165775bdb94b32a0c187fba4271f"} Mar 14 07:57:36 crc kubenswrapper[4893]: I0314 07:57:36.315773 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerStarted","Data":"3aecc1105a1bf53f3ae4b5f6d699ee6535085dd6207f6ae50244f21916e3390e"} Mar 14 07:57:37 crc kubenswrapper[4893]: I0314 07:57:37.325622 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerID="3aecc1105a1bf53f3ae4b5f6d699ee6535085dd6207f6ae50244f21916e3390e" exitCode=0 Mar 14 07:57:37 crc kubenswrapper[4893]: I0314 07:57:37.325700 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerDied","Data":"3aecc1105a1bf53f3ae4b5f6d699ee6535085dd6207f6ae50244f21916e3390e"} Mar 14 07:57:38 crc kubenswrapper[4893]: I0314 07:57:38.381394 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerStarted","Data":"022017edc9ae47a9157f9eac4950a5e4ae73a8bf4fe52997c238e80787643535"} Mar 14 07:57:38 crc kubenswrapper[4893]: I0314 07:57:38.537801 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmqms" podStartSLOduration=2.111175857 podStartE2EDuration="4.537776015s" podCreationTimestamp="2026-03-14 07:57:34 +0000 UTC" firstStartedPulling="2026-03-14 07:57:35.307901543 +0000 UTC m=+3534.570078335" lastFinishedPulling="2026-03-14 07:57:37.734501701 +0000 UTC m=+3536.996678493" observedRunningTime="2026-03-14 07:57:38.528965718 +0000 UTC m=+3537.791142530" watchObservedRunningTime="2026-03-14 07:57:38.537776015 +0000 UTC m=+3537.799952807" Mar 14 07:57:44 crc kubenswrapper[4893]: I0314 07:57:44.414769 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:44 crc kubenswrapper[4893]: I0314 07:57:44.416488 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:44 crc kubenswrapper[4893]: I0314 07:57:44.456807 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:45 crc 
kubenswrapper[4893]: I0314 07:57:45.460509 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:45 crc kubenswrapper[4893]: I0314 07:57:45.503910 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"] Mar 14 07:57:47 crc kubenswrapper[4893]: I0314 07:57:47.439006 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmqms" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="registry-server" containerID="cri-o://022017edc9ae47a9157f9eac4950a5e4ae73a8bf4fe52997c238e80787643535" gracePeriod=2 Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.468272 4893 generic.go:334] "Generic (PLEG): container finished" podID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerID="022017edc9ae47a9157f9eac4950a5e4ae73a8bf4fe52997c238e80787643535" exitCode=0 Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.468412 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerDied","Data":"022017edc9ae47a9157f9eac4950a5e4ae73a8bf4fe52997c238e80787643535"} Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.587496 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmqms" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.775921 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content\") pod \"5f5fef37-5270-4275-bc08-2226d0f92ef1\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.776032 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities\") pod \"5f5fef37-5270-4275-bc08-2226d0f92ef1\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.776082 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rr56\" (UniqueName: \"kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56\") pod \"5f5fef37-5270-4275-bc08-2226d0f92ef1\" (UID: \"5f5fef37-5270-4275-bc08-2226d0f92ef1\") " Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.776803 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities" (OuterVolumeSpecName: "utilities") pod "5f5fef37-5270-4275-bc08-2226d0f92ef1" (UID: "5f5fef37-5270-4275-bc08-2226d0f92ef1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.781597 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56" (OuterVolumeSpecName: "kube-api-access-7rr56") pod "5f5fef37-5270-4275-bc08-2226d0f92ef1" (UID: "5f5fef37-5270-4275-bc08-2226d0f92ef1"). InnerVolumeSpecName "kube-api-access-7rr56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.877736 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.878477 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rr56\" (UniqueName: \"kubernetes.io/projected/5f5fef37-5270-4275-bc08-2226d0f92ef1-kube-api-access-7rr56\") on node \"crc\" DevicePath \"\"" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.942341 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f5fef37-5270-4275-bc08-2226d0f92ef1" (UID: "5f5fef37-5270-4275-bc08-2226d0f92ef1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:57:49 crc kubenswrapper[4893]: I0314 07:57:49.979728 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f5fef37-5270-4275-bc08-2226d0f92ef1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.477329 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmqms" event={"ID":"5f5fef37-5270-4275-bc08-2226d0f92ef1","Type":"ContainerDied","Data":"8e8bee8a17658c43d82c9b43d345931a97bb165775bdb94b32a0c187fba4271f"} Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.477401 4893 scope.go:117] "RemoveContainer" containerID="022017edc9ae47a9157f9eac4950a5e4ae73a8bf4fe52997c238e80787643535" Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.477404 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmqms"
Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.496149 4893 scope.go:117] "RemoveContainer" containerID="3aecc1105a1bf53f3ae4b5f6d699ee6535085dd6207f6ae50244f21916e3390e"
Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.515767 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"]
Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.519813 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmqms"]
Mar 14 07:57:50 crc kubenswrapper[4893]: I0314 07:57:50.544592 4893 scope.go:117] "RemoveContainer" containerID="2c7c66c2266531e7dbd254ac85a6bf8c4cd7e26a59eb6e46c102471fe8f95d08"
Mar 14 07:57:51 crc kubenswrapper[4893]: I0314 07:57:51.410533 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" path="/var/lib/kubelet/pods/5f5fef37-5270-4275-bc08-2226d0f92ef1/volumes"
Mar 14 07:57:59 crc kubenswrapper[4893]: I0314 07:57:59.731408 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:57:59 crc kubenswrapper[4893]: I0314 07:57:59.732146 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.149295 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557918-qn6hf"]
Mar 14 07:58:00 crc kubenswrapper[4893]: E0314 07:58:00.149989 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="registry-server"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.150010 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="registry-server"
Mar 14 07:58:00 crc kubenswrapper[4893]: E0314 07:58:00.150052 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="extract-content"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.150059 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="extract-content"
Mar 14 07:58:00 crc kubenswrapper[4893]: E0314 07:58:00.150074 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="extract-utilities"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.150082 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="extract-utilities"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.150241 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5fef37-5270-4275-bc08-2226d0f92ef1" containerName="registry-server"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.150835 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.154082 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.154310 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.154517 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-qn6hf"]
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.155667 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.332682 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6xff\" (UniqueName: \"kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff\") pod \"auto-csr-approver-29557918-qn6hf\" (UID: \"1235048d-af3c-4bfb-9211-d22046eabef2\") " pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.434563 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6xff\" (UniqueName: \"kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff\") pod \"auto-csr-approver-29557918-qn6hf\" (UID: \"1235048d-af3c-4bfb-9211-d22046eabef2\") " pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.453617 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6xff\" (UniqueName: \"kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff\") pod \"auto-csr-approver-29557918-qn6hf\" (UID: \"1235048d-af3c-4bfb-9211-d22046eabef2\") " pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.468255 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.870603 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-qn6hf"]
Mar 14 07:58:00 crc kubenswrapper[4893]: I0314 07:58:00.877543 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 07:58:01 crc kubenswrapper[4893]: I0314 07:58:01.566358 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-qn6hf" event={"ID":"1235048d-af3c-4bfb-9211-d22046eabef2","Type":"ContainerStarted","Data":"581027bb375e1aecdd751932fe752d13e0d7d92878424b507309aad748940142"}
Mar 14 07:58:02 crc kubenswrapper[4893]: I0314 07:58:02.573570 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-qn6hf" event={"ID":"1235048d-af3c-4bfb-9211-d22046eabef2","Type":"ContainerStarted","Data":"f51044283b1cb891aec40f5870424c5ec47527a01af0aba95c783a491c45a2fc"}
Mar 14 07:58:02 crc kubenswrapper[4893]: I0314 07:58:02.586905 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557918-qn6hf" podStartSLOduration=1.253018306 podStartE2EDuration="2.586887926s" podCreationTimestamp="2026-03-14 07:58:00 +0000 UTC" firstStartedPulling="2026-03-14 07:58:00.877334361 +0000 UTC m=+3560.139511163" lastFinishedPulling="2026-03-14 07:58:02.211203991 +0000 UTC m=+3561.473380783" observedRunningTime="2026-03-14 07:58:02.585230059 +0000 UTC m=+3561.847406851" watchObservedRunningTime="2026-03-14 07:58:02.586887926 +0000 UTC m=+3561.849064718"
Mar 14 07:58:03 crc kubenswrapper[4893]: I0314 07:58:03.580392 4893 generic.go:334] "Generic (PLEG): container finished" podID="1235048d-af3c-4bfb-9211-d22046eabef2" containerID="f51044283b1cb891aec40f5870424c5ec47527a01af0aba95c783a491c45a2fc" exitCode=0
Mar 14 07:58:03 crc kubenswrapper[4893]: I0314 07:58:03.580446 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-qn6hf" event={"ID":"1235048d-af3c-4bfb-9211-d22046eabef2","Type":"ContainerDied","Data":"f51044283b1cb891aec40f5870424c5ec47527a01af0aba95c783a491c45a2fc"}
Mar 14 07:58:04 crc kubenswrapper[4893]: I0314 07:58:04.829746 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:04 crc kubenswrapper[4893]: I0314 07:58:04.994337 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6xff\" (UniqueName: \"kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff\") pod \"1235048d-af3c-4bfb-9211-d22046eabef2\" (UID: \"1235048d-af3c-4bfb-9211-d22046eabef2\") "
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.000243 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff" (OuterVolumeSpecName: "kube-api-access-w6xff") pod "1235048d-af3c-4bfb-9211-d22046eabef2" (UID: "1235048d-af3c-4bfb-9211-d22046eabef2"). InnerVolumeSpecName "kube-api-access-w6xff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.097566 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6xff\" (UniqueName: \"kubernetes.io/projected/1235048d-af3c-4bfb-9211-d22046eabef2-kube-api-access-w6xff\") on node \"crc\" DevicePath \"\""
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.593475 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-qn6hf" event={"ID":"1235048d-af3c-4bfb-9211-d22046eabef2","Type":"ContainerDied","Data":"581027bb375e1aecdd751932fe752d13e0d7d92878424b507309aad748940142"}
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.593813 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581027bb375e1aecdd751932fe752d13e0d7d92878424b507309aad748940142"
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.593512 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-qn6hf"
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.646933 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-9vhtd"]
Mar 14 07:58:05 crc kubenswrapper[4893]: I0314 07:58:05.652027 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-9vhtd"]
Mar 14 07:58:07 crc kubenswrapper[4893]: I0314 07:58:07.389087 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d" path="/var/lib/kubelet/pods/63ca36f5-9a7b-4f5c-ab5a-48f13abcaf6d/volumes"
Mar 14 07:58:08 crc kubenswrapper[4893]: I0314 07:58:08.216473 4893 scope.go:117] "RemoveContainer" containerID="6392e154b8552643b629c7840e16b54867cfc0ce6f6256f55e0a98f48ce51b56"
Mar 14 07:58:29 crc kubenswrapper[4893]: I0314 07:58:29.731086 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:58:29 crc kubenswrapper[4893]: I0314 07:58:29.731789 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:58:29 crc kubenswrapper[4893]: I0314 07:58:29.731840 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q"
Mar 14 07:58:29 crc kubenswrapper[4893]: I0314 07:58:29.732650 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 07:58:29 crc kubenswrapper[4893]: I0314 07:58:29.732705 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e" gracePeriod=600
Mar 14 07:58:30 crc kubenswrapper[4893]: I0314 07:58:30.791372 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e" exitCode=0
Mar 14 07:58:30 crc kubenswrapper[4893]: I0314 07:58:30.791392 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e"}
Mar 14 07:58:30 crc kubenswrapper[4893]: I0314 07:58:30.792098 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d"}
Mar 14 07:58:30 crc kubenswrapper[4893]: I0314 07:58:30.792145 4893 scope.go:117] "RemoveContainer" containerID="7d4f888dda4f72e93fbe049c9579bc4d31994ffcfdb006bc2c321cea789b8be2"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.865787 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:31 crc kubenswrapper[4893]: E0314 07:58:31.866936 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1235048d-af3c-4bfb-9211-d22046eabef2" containerName="oc"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.867021 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="1235048d-af3c-4bfb-9211-d22046eabef2" containerName="oc"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.867271 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1235048d-af3c-4bfb-9211-d22046eabef2" containerName="oc"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.868583 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.877732 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.986272 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.986383 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:31 crc kubenswrapper[4893]: I0314 07:58:31.986465 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmfw\" (UniqueName: \"kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.088170 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.088256 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmfw\" (UniqueName: \"kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.088313 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.088862 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.088911 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.110648 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmfw\" (UniqueName: \"kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw\") pod \"redhat-marketplace-pdglt\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") " pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.186998 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.596364 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.808649 4893 generic.go:334] "Generic (PLEG): container finished" podID="2812f64d-f792-4646-8368-274d9f3e9c46" containerID="00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e" exitCode=0
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.808723 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerDied","Data":"00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e"}
Mar 14 07:58:32 crc kubenswrapper[4893]: I0314 07:58:32.809017 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerStarted","Data":"463cf4c4071241ff9847c0ec76acc90a27aa51fb7ccbbfede0487235923b9932"}
Mar 14 07:58:33 crc kubenswrapper[4893]: I0314 07:58:33.817132 4893 generic.go:334] "Generic (PLEG): container finished" podID="2812f64d-f792-4646-8368-274d9f3e9c46" containerID="762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289" exitCode=0
Mar 14 07:58:33 crc kubenswrapper[4893]: I0314 07:58:33.817172 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerDied","Data":"762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289"}
Mar 14 07:58:34 crc kubenswrapper[4893]: I0314 07:58:34.833679 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerStarted","Data":"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"}
Mar 14 07:58:34 crc kubenswrapper[4893]: I0314 07:58:34.857847 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pdglt" podStartSLOduration=2.356500465 podStartE2EDuration="3.857826446s" podCreationTimestamp="2026-03-14 07:58:31 +0000 UTC" firstStartedPulling="2026-03-14 07:58:32.810099404 +0000 UTC m=+3592.072276196" lastFinishedPulling="2026-03-14 07:58:34.311425385 +0000 UTC m=+3593.573602177" observedRunningTime="2026-03-14 07:58:34.852125148 +0000 UTC m=+3594.114301950" watchObservedRunningTime="2026-03-14 07:58:34.857826446 +0000 UTC m=+3594.120003238"
Mar 14 07:58:42 crc kubenswrapper[4893]: I0314 07:58:42.188158 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:42 crc kubenswrapper[4893]: I0314 07:58:42.188736 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:42 crc kubenswrapper[4893]: I0314 07:58:42.226467 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:42 crc kubenswrapper[4893]: I0314 07:58:42.934310 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:42 crc kubenswrapper[4893]: I0314 07:58:42.972862 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:44 crc kubenswrapper[4893]: I0314 07:58:44.907130 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pdglt" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="registry-server" containerID="cri-o://603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35" gracePeriod=2
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.267012 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.386380 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmfw\" (UniqueName: \"kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw\") pod \"2812f64d-f792-4646-8368-274d9f3e9c46\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") "
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.386620 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content\") pod \"2812f64d-f792-4646-8368-274d9f3e9c46\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") "
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.386709 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities\") pod \"2812f64d-f792-4646-8368-274d9f3e9c46\" (UID: \"2812f64d-f792-4646-8368-274d9f3e9c46\") "
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.388422 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities" (OuterVolumeSpecName: "utilities") pod "2812f64d-f792-4646-8368-274d9f3e9c46" (UID: "2812f64d-f792-4646-8368-274d9f3e9c46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.394253 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw" (OuterVolumeSpecName: "kube-api-access-xbmfw") pod "2812f64d-f792-4646-8368-274d9f3e9c46" (UID: "2812f64d-f792-4646-8368-274d9f3e9c46"). InnerVolumeSpecName "kube-api-access-xbmfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.413959 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2812f64d-f792-4646-8368-274d9f3e9c46" (UID: "2812f64d-f792-4646-8368-274d9f3e9c46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.489134 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmfw\" (UniqueName: \"kubernetes.io/projected/2812f64d-f792-4646-8368-274d9f3e9c46-kube-api-access-xbmfw\") on node \"crc\" DevicePath \"\""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.489171 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.489181 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2812f64d-f792-4646-8368-274d9f3e9c46-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.916479 4893 generic.go:334] "Generic (PLEG): container finished" podID="2812f64d-f792-4646-8368-274d9f3e9c46" containerID="603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35" exitCode=0
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.916901 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerDied","Data":"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"}
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.916927 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdglt" event={"ID":"2812f64d-f792-4646-8368-274d9f3e9c46","Type":"ContainerDied","Data":"463cf4c4071241ff9847c0ec76acc90a27aa51fb7ccbbfede0487235923b9932"}
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.916943 4893 scope.go:117] "RemoveContainer" containerID="603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.917080 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdglt"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.945391 4893 scope.go:117] "RemoveContainer" containerID="762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.949163 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.958829 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdglt"]
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.979409 4893 scope.go:117] "RemoveContainer" containerID="00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.993514 4893 scope.go:117] "RemoveContainer" containerID="603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"
Mar 14 07:58:45 crc kubenswrapper[4893]: E0314 07:58:45.993955 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35\": container with ID starting with 603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35 not found: ID does not exist" containerID="603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.993999 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35"} err="failed to get container status \"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35\": rpc error: code = NotFound desc = could not find container \"603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35\": container with ID starting with 603373d9e4aae4e55af08f5cf43deffa3e518c85bf24c5d651abf8943a735f35 not found: ID does not exist"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.994073 4893 scope.go:117] "RemoveContainer" containerID="762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289"
Mar 14 07:58:45 crc kubenswrapper[4893]: E0314 07:58:45.994419 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289\": container with ID starting with 762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289 not found: ID does not exist" containerID="762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.994468 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289"} err="failed to get container status \"762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289\": rpc error: code = NotFound desc = could not find container \"762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289\": container with ID starting with 762f68d74378cad91be6d0bb8fb79e349ec62e19e4d3969540078f1b1b243289 not found: ID does not exist"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.994486 4893 scope.go:117] "RemoveContainer" containerID="00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e"
Mar 14 07:58:45 crc kubenswrapper[4893]: E0314 07:58:45.994742 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e\": container with ID starting with 00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e not found: ID does not exist" containerID="00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e"
Mar 14 07:58:45 crc kubenswrapper[4893]: I0314 07:58:45.994768 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e"} err="failed to get container status \"00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e\": rpc error: code = NotFound desc = could not find container \"00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e\": container with ID starting with 00986dda815626dade93d271cc03a82d742f44b4a9be55791ea353b1341f7c9e not found: ID does not exist"
Mar 14 07:58:47 crc kubenswrapper[4893]: I0314 07:58:47.385685 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" path="/var/lib/kubelet/pods/2812f64d-f792-4646-8368-274d9f3e9c46/volumes"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.161585 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"]
Mar 14 08:00:00 crc kubenswrapper[4893]: E0314 08:00:00.162513 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="registry-server"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.162550 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="registry-server"
Mar 14 08:00:00 crc kubenswrapper[4893]: E0314 08:00:00.162573 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="extract-content"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.162580 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="extract-content"
Mar 14 08:00:00 crc kubenswrapper[4893]: E0314 08:00:00.162598 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="extract-utilities"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.162606 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="extract-utilities"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.162763 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2812f64d-f792-4646-8368-274d9f3e9c46" containerName="registry-server"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.163310 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.167890 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.168316 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.178891 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"]
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.247331 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.247683 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.247751 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw9fn\" (UniqueName: \"kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.257473 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557920-2t5wd"]
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.258548 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-2t5wd"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.263404 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.263504 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.263779 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.269959 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-2t5wd"]
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.348912 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.348985 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.349024 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjtw\" (UniqueName: \"kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw\") pod \"auto-csr-approver-29557920-2t5wd\" (UID: \"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12\") " pod="openshift-infra/auto-csr-approver-29557920-2t5wd"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.349114 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw9fn\" (UniqueName: \"kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.350149 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.354424 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.365491 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw9fn\" (UniqueName: \"kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn\") pod \"collect-profiles-29557920-5g6pt\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.450031 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjtw\" (UniqueName: \"kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw\") pod \"auto-csr-approver-29557920-2t5wd\" (UID: \"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12\") " pod="openshift-infra/auto-csr-approver-29557920-2t5wd"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.465130 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjtw\" (UniqueName: \"kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw\") pod \"auto-csr-approver-29557920-2t5wd\" (UID: \"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12\") " pod="openshift-infra/auto-csr-approver-29557920-2t5wd"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.486535 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"
Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.575939 4893 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" Mar 14 08:00:00 crc kubenswrapper[4893]: I0314 08:00:00.916487 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt"] Mar 14 08:00:01 crc kubenswrapper[4893]: I0314 08:00:01.011950 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-2t5wd"] Mar 14 08:00:01 crc kubenswrapper[4893]: W0314 08:00:01.017692 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc321fc_d0f5_4d3a_94a2_51680b2cbb12.slice/crio-eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088 WatchSource:0}: Error finding container eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088: Status 404 returned error can't find the container with id eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088 Mar 14 08:00:01 crc kubenswrapper[4893]: I0314 08:00:01.527748 4893 generic.go:334] "Generic (PLEG): container finished" podID="5631551a-67c1-42ad-8cea-3e735558e2db" containerID="63dd343d39cda292c1079e99d54f833a79a12b6a9af41c23e124e812f73c9451" exitCode=0 Mar 14 08:00:01 crc kubenswrapper[4893]: I0314 08:00:01.527805 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt" event={"ID":"5631551a-67c1-42ad-8cea-3e735558e2db","Type":"ContainerDied","Data":"63dd343d39cda292c1079e99d54f833a79a12b6a9af41c23e124e812f73c9451"} Mar 14 08:00:01 crc kubenswrapper[4893]: I0314 08:00:01.528199 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt" event={"ID":"5631551a-67c1-42ad-8cea-3e735558e2db","Type":"ContainerStarted","Data":"57c662be395ee1df115f449c51fcd0f868e4ee04582e127a6bfaca4320f35d44"} Mar 14 08:00:01 crc kubenswrapper[4893]: I0314 08:00:01.529170 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" event={"ID":"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12","Type":"ContainerStarted","Data":"eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088"} Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.770244 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.886405 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw9fn\" (UniqueName: \"kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn\") pod \"5631551a-67c1-42ad-8cea-3e735558e2db\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.886546 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume\") pod \"5631551a-67c1-42ad-8cea-3e735558e2db\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.886580 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume\") pod \"5631551a-67c1-42ad-8cea-3e735558e2db\" (UID: \"5631551a-67c1-42ad-8cea-3e735558e2db\") " Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.887356 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume" (OuterVolumeSpecName: "config-volume") pod "5631551a-67c1-42ad-8cea-3e735558e2db" (UID: "5631551a-67c1-42ad-8cea-3e735558e2db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.893182 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn" (OuterVolumeSpecName: "kube-api-access-pw9fn") pod "5631551a-67c1-42ad-8cea-3e735558e2db" (UID: "5631551a-67c1-42ad-8cea-3e735558e2db"). InnerVolumeSpecName "kube-api-access-pw9fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.893737 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5631551a-67c1-42ad-8cea-3e735558e2db" (UID: "5631551a-67c1-42ad-8cea-3e735558e2db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.987891 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw9fn\" (UniqueName: \"kubernetes.io/projected/5631551a-67c1-42ad-8cea-3e735558e2db-kube-api-access-pw9fn\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.988113 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5631551a-67c1-42ad-8cea-3e735558e2db-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:02 crc kubenswrapper[4893]: I0314 08:00:02.988200 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5631551a-67c1-42ad-8cea-3e735558e2db-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:03 crc kubenswrapper[4893]: I0314 08:00:03.544675 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt" 
event={"ID":"5631551a-67c1-42ad-8cea-3e735558e2db","Type":"ContainerDied","Data":"57c662be395ee1df115f449c51fcd0f868e4ee04582e127a6bfaca4320f35d44"} Mar 14 08:00:03 crc kubenswrapper[4893]: I0314 08:00:03.544717 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c662be395ee1df115f449c51fcd0f868e4ee04582e127a6bfaca4320f35d44" Mar 14 08:00:03 crc kubenswrapper[4893]: I0314 08:00:03.544760 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-5g6pt" Mar 14 08:00:03 crc kubenswrapper[4893]: I0314 08:00:03.842348 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd"] Mar 14 08:00:03 crc kubenswrapper[4893]: I0314 08:00:03.847324 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-qf9nd"] Mar 14 08:00:05 crc kubenswrapper[4893]: I0314 08:00:05.385056 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27ae91a-070d-493c-a949-3ae6ffe8d411" path="/var/lib/kubelet/pods/a27ae91a-070d-493c-a949-3ae6ffe8d411/volumes" Mar 14 08:00:08 crc kubenswrapper[4893]: I0314 08:00:08.324597 4893 scope.go:117] "RemoveContainer" containerID="edbf875ce088881f17ee94c2d60bb6b28c0e3260767d018d787baece34481800" Mar 14 08:00:25 crc kubenswrapper[4893]: I0314 08:00:25.706881 4893 generic.go:334] "Generic (PLEG): container finished" podID="1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" containerID="8ccba8a80c4d3b3b00968d284f510a5117e0b3aa5a15da14d259d5b48c258fc4" exitCode=0 Mar 14 08:00:25 crc kubenswrapper[4893]: I0314 08:00:25.707370 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" event={"ID":"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12","Type":"ContainerDied","Data":"8ccba8a80c4d3b3b00968d284f510a5117e0b3aa5a15da14d259d5b48c258fc4"} Mar 14 08:00:26 
crc kubenswrapper[4893]: I0314 08:00:26.978440 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.021496 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjtw\" (UniqueName: \"kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw\") pod \"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12\" (UID: \"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12\") " Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.028022 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw" (OuterVolumeSpecName: "kube-api-access-5cjtw") pod "1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" (UID: "1cc321fc-d0f5-4d3a-94a2-51680b2cbb12"). InnerVolumeSpecName "kube-api-access-5cjtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.122808 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjtw\" (UniqueName: \"kubernetes.io/projected/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12-kube-api-access-5cjtw\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.726241 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" event={"ID":"1cc321fc-d0f5-4d3a-94a2-51680b2cbb12","Type":"ContainerDied","Data":"eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088"} Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.726614 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff26481908b7fdd1e39ff70b1e7455ca7cf2b866162600350f38590e24d2088" Mar 14 08:00:27 crc kubenswrapper[4893]: I0314 08:00:27.726281 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-2t5wd" Mar 14 08:00:28 crc kubenswrapper[4893]: I0314 08:00:28.028362 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-dnfgf"] Mar 14 08:00:28 crc kubenswrapper[4893]: I0314 08:00:28.033722 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-dnfgf"] Mar 14 08:00:29 crc kubenswrapper[4893]: I0314 08:00:29.384909 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6df7899-9f4b-4121-8025-9290657a6bd3" path="/var/lib/kubelet/pods/e6df7899-9f4b-4121-8025-9290657a6bd3/volumes" Mar 14 08:00:29 crc kubenswrapper[4893]: I0314 08:00:29.732394 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:00:29 crc kubenswrapper[4893]: I0314 08:00:29.732627 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:00:59 crc kubenswrapper[4893]: I0314 08:00:59.730902 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:00:59 crc kubenswrapper[4893]: I0314 08:00:59.731507 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:01:08 crc kubenswrapper[4893]: I0314 08:01:08.365982 4893 scope.go:117] "RemoveContainer" containerID="800ac63c5921614b16f269925d4144d36ef0e0f9e0e588031b796af66f99317f" Mar 14 08:01:29 crc kubenswrapper[4893]: I0314 08:01:29.731680 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:01:29 crc kubenswrapper[4893]: I0314 08:01:29.732345 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:01:29 crc kubenswrapper[4893]: I0314 08:01:29.732410 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:01:29 crc kubenswrapper[4893]: I0314 08:01:29.733264 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:01:29 crc kubenswrapper[4893]: I0314 08:01:29.733339 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" gracePeriod=600 Mar 14 08:01:29 crc kubenswrapper[4893]: E0314 08:01:29.856243 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:01:30 crc kubenswrapper[4893]: I0314 08:01:30.222805 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" exitCode=0 Mar 14 08:01:30 crc kubenswrapper[4893]: I0314 08:01:30.222860 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d"} Mar 14 08:01:30 crc kubenswrapper[4893]: I0314 08:01:30.222919 4893 scope.go:117] "RemoveContainer" containerID="32e856774aaaa7f3a020545a154284566b18a34781ac94de4a1754e21d05aa2e" Mar 14 08:01:30 crc kubenswrapper[4893]: I0314 08:01:30.223506 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:01:30 crc kubenswrapper[4893]: E0314 08:01:30.223846 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:01:42 crc kubenswrapper[4893]: I0314 08:01:42.376594 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:01:42 crc kubenswrapper[4893]: E0314 08:01:42.377469 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:01:56 crc kubenswrapper[4893]: I0314 08:01:56.377606 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:01:56 crc kubenswrapper[4893]: E0314 08:01:56.380512 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.179218 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557922-v7wgf"] Mar 14 08:02:00 crc kubenswrapper[4893]: E0314 08:02:00.180054 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.180068 4893 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[4893]: E0314 08:02:00.180092 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5631551a-67c1-42ad-8cea-3e735558e2db" containerName="collect-profiles" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.180098 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="5631551a-67c1-42ad-8cea-3e735558e2db" containerName="collect-profiles" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.180253 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="5631551a-67c1-42ad-8cea-3e735558e2db" containerName="collect-profiles" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.180264 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.180744 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.183108 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.183365 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.184835 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.201261 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-v7wgf"] Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.371835 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4r5p\" (UniqueName: \"kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p\") pod \"auto-csr-approver-29557922-v7wgf\" (UID: \"6f084cb0-1c92-4a6a-879a-83c7e30161b9\") " pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.473004 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4r5p\" (UniqueName: \"kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p\") pod \"auto-csr-approver-29557922-v7wgf\" (UID: \"6f084cb0-1c92-4a6a-879a-83c7e30161b9\") " pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.501187 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4r5p\" (UniqueName: \"kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p\") pod \"auto-csr-approver-29557922-v7wgf\" (UID: \"6f084cb0-1c92-4a6a-879a-83c7e30161b9\") " 
pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:00 crc kubenswrapper[4893]: I0314 08:02:00.798630 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:01 crc kubenswrapper[4893]: I0314 08:02:01.238961 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-v7wgf"] Mar 14 08:02:01 crc kubenswrapper[4893]: I0314 08:02:01.465348 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" event={"ID":"6f084cb0-1c92-4a6a-879a-83c7e30161b9","Type":"ContainerStarted","Data":"160692d1d49189f225a155087aef8d3b67b30ae0bcf1dafdb0f7386806154e7e"} Mar 14 08:02:02 crc kubenswrapper[4893]: I0314 08:02:02.472252 4893 generic.go:334] "Generic (PLEG): container finished" podID="6f084cb0-1c92-4a6a-879a-83c7e30161b9" containerID="319687da446251c9089809ac2740feba427fc38527c13ba548c0dc1e69aa05d8" exitCode=0 Mar 14 08:02:02 crc kubenswrapper[4893]: I0314 08:02:02.472330 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" event={"ID":"6f084cb0-1c92-4a6a-879a-83c7e30161b9","Type":"ContainerDied","Data":"319687da446251c9089809ac2740feba427fc38527c13ba548c0dc1e69aa05d8"} Mar 14 08:02:03 crc kubenswrapper[4893]: I0314 08:02:03.783715 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:03 crc kubenswrapper[4893]: I0314 08:02:03.916770 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4r5p\" (UniqueName: \"kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p\") pod \"6f084cb0-1c92-4a6a-879a-83c7e30161b9\" (UID: \"6f084cb0-1c92-4a6a-879a-83c7e30161b9\") " Mar 14 08:02:03 crc kubenswrapper[4893]: I0314 08:02:03.925878 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p" (OuterVolumeSpecName: "kube-api-access-n4r5p") pod "6f084cb0-1c92-4a6a-879a-83c7e30161b9" (UID: "6f084cb0-1c92-4a6a-879a-83c7e30161b9"). InnerVolumeSpecName "kube-api-access-n4r5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.018112 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4r5p\" (UniqueName: \"kubernetes.io/projected/6f084cb0-1c92-4a6a-879a-83c7e30161b9-kube-api-access-n4r5p\") on node \"crc\" DevicePath \"\"" Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.487851 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" event={"ID":"6f084cb0-1c92-4a6a-879a-83c7e30161b9","Type":"ContainerDied","Data":"160692d1d49189f225a155087aef8d3b67b30ae0bcf1dafdb0f7386806154e7e"} Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.488179 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160692d1d49189f225a155087aef8d3b67b30ae0bcf1dafdb0f7386806154e7e" Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.487921 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-v7wgf" Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.847122 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-vrrd2"] Mar 14 08:02:04 crc kubenswrapper[4893]: I0314 08:02:04.851661 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-vrrd2"] Mar 14 08:02:05 crc kubenswrapper[4893]: I0314 08:02:05.384369 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cabd36-ad52-4161-94a7-aaa72b6eeb9c" path="/var/lib/kubelet/pods/d9cabd36-ad52-4161-94a7-aaa72b6eeb9c/volumes" Mar 14 08:02:08 crc kubenswrapper[4893]: I0314 08:02:08.376363 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:02:08 crc kubenswrapper[4893]: E0314 08:02:08.376961 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:02:08 crc kubenswrapper[4893]: I0314 08:02:08.428063 4893 scope.go:117] "RemoveContainer" containerID="dc8fcd96f1d56cc44be15d9af74681fe2a4ad8c84f52bb28d81cc79d0c576c2d" Mar 14 08:02:19 crc kubenswrapper[4893]: I0314 08:02:19.376204 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:02:19 crc kubenswrapper[4893]: E0314 08:02:19.376935 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:02:30 crc kubenswrapper[4893]: I0314 08:02:30.376945 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:02:30 crc kubenswrapper[4893]: E0314 08:02:30.377472 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:02:43 crc kubenswrapper[4893]: I0314 08:02:43.377144 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:02:43 crc kubenswrapper[4893]: E0314 08:02:43.378632 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:02:57 crc kubenswrapper[4893]: I0314 08:02:57.377189 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:02:57 crc kubenswrapper[4893]: E0314 08:02:57.377985 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:03:09 crc kubenswrapper[4893]: I0314 08:03:09.377024 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:03:09 crc kubenswrapper[4893]: E0314 08:03:09.378344 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:03:24 crc kubenswrapper[4893]: I0314 08:03:24.376229 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:03:24 crc kubenswrapper[4893]: E0314 08:03:24.377004 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.345569 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:35 crc kubenswrapper[4893]: E0314 08:03:35.346659 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f084cb0-1c92-4a6a-879a-83c7e30161b9" containerName="oc" Mar 14 08:03:35 crc kubenswrapper[4893]: 
I0314 08:03:35.346680 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f084cb0-1c92-4a6a-879a-83c7e30161b9" containerName="oc" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.346904 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f084cb0-1c92-4a6a-879a-83c7e30161b9" containerName="oc" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.348252 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.361385 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.404409 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588d6\" (UniqueName: \"kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.404985 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.405348 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 
08:03:35.505942 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588d6\" (UniqueName: \"kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.506021 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.506049 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.506782 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.506843 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.525278 4893 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588d6\" (UniqueName: \"kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6\") pod \"community-operators-r9hd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.689653 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:35 crc kubenswrapper[4893]: I0314 08:03:35.993407 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:36 crc kubenswrapper[4893]: I0314 08:03:36.136831 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerStarted","Data":"d99d20dafb69b216ceed735f9d79831d7e8cd00c078c80f97e6f205c34f6fa4c"} Mar 14 08:03:37 crc kubenswrapper[4893]: I0314 08:03:37.151399 4893 generic.go:334] "Generic (PLEG): container finished" podID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerID="17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9" exitCode=0 Mar 14 08:03:37 crc kubenswrapper[4893]: I0314 08:03:37.151736 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerDied","Data":"17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9"} Mar 14 08:03:37 crc kubenswrapper[4893]: I0314 08:03:37.156025 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:03:38 crc kubenswrapper[4893]: I0314 08:03:38.377371 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:03:38 crc kubenswrapper[4893]: E0314 
08:03:38.378063 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:03:39 crc kubenswrapper[4893]: I0314 08:03:39.169813 4893 generic.go:334] "Generic (PLEG): container finished" podID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerID="d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721" exitCode=0 Mar 14 08:03:39 crc kubenswrapper[4893]: I0314 08:03:39.169871 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerDied","Data":"d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721"} Mar 14 08:03:40 crc kubenswrapper[4893]: I0314 08:03:40.184370 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerStarted","Data":"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740"} Mar 14 08:03:40 crc kubenswrapper[4893]: I0314 08:03:40.218904 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9hd5" podStartSLOduration=2.750349005 podStartE2EDuration="5.218882632s" podCreationTimestamp="2026-03-14 08:03:35 +0000 UTC" firstStartedPulling="2026-03-14 08:03:37.15546891 +0000 UTC m=+3896.417645722" lastFinishedPulling="2026-03-14 08:03:39.624002547 +0000 UTC m=+3898.886179349" observedRunningTime="2026-03-14 08:03:40.211654246 +0000 UTC m=+3899.473831048" watchObservedRunningTime="2026-03-14 08:03:40.218882632 +0000 UTC m=+3899.481059434" Mar 14 08:03:45 
crc kubenswrapper[4893]: I0314 08:03:45.689991 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:45 crc kubenswrapper[4893]: I0314 08:03:45.690811 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:45 crc kubenswrapper[4893]: I0314 08:03:45.728902 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:46 crc kubenswrapper[4893]: I0314 08:03:46.291453 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:46 crc kubenswrapper[4893]: I0314 08:03:46.335193 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.243814 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r9hd5" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="registry-server" containerID="cri-o://d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740" gracePeriod=2 Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.612773 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.803781 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content\") pod \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.803851 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities\") pod \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.803941 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588d6\" (UniqueName: \"kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6\") pod \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\" (UID: \"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5\") " Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.804985 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities" (OuterVolumeSpecName: "utilities") pod "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" (UID: "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.809766 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6" (OuterVolumeSpecName: "kube-api-access-588d6") pod "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" (UID: "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5"). InnerVolumeSpecName "kube-api-access-588d6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.876346 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" (UID: "fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.905768 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.905800 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:03:48 crc kubenswrapper[4893]: I0314 08:03:48.905811 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588d6\" (UniqueName: \"kubernetes.io/projected/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5-kube-api-access-588d6\") on node \"crc\" DevicePath \"\"" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.254228 4893 generic.go:334] "Generic (PLEG): container finished" podID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerID="d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740" exitCode=0 Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.254304 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9hd5" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.254300 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerDied","Data":"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740"} Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.254725 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9hd5" event={"ID":"fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5","Type":"ContainerDied","Data":"d99d20dafb69b216ceed735f9d79831d7e8cd00c078c80f97e6f205c34f6fa4c"} Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.254749 4893 scope.go:117] "RemoveContainer" containerID="d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.278146 4893 scope.go:117] "RemoveContainer" containerID="d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.304841 4893 scope.go:117] "RemoveContainer" containerID="17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.305031 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.309614 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9hd5"] Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.337583 4893 scope.go:117] "RemoveContainer" containerID="d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740" Mar 14 08:03:49 crc kubenswrapper[4893]: E0314 08:03:49.337982 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740\": container with ID starting with d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740 not found: ID does not exist" containerID="d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.338030 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740"} err="failed to get container status \"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740\": rpc error: code = NotFound desc = could not find container \"d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740\": container with ID starting with d2ac5ff0e4b18fe44d79199ca9d88df89c9db47a7b7a8951d1a1db0c3de3c740 not found: ID does not exist" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.338058 4893 scope.go:117] "RemoveContainer" containerID="d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721" Mar 14 08:03:49 crc kubenswrapper[4893]: E0314 08:03:49.338354 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721\": container with ID starting with d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721 not found: ID does not exist" containerID="d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.338384 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721"} err="failed to get container status \"d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721\": rpc error: code = NotFound desc = could not find container \"d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721\": container with ID 
starting with d1e52910981ce00ca45153fbf9766b36894f9df1442dacb1403079dce9544721 not found: ID does not exist" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.338407 4893 scope.go:117] "RemoveContainer" containerID="17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9" Mar 14 08:03:49 crc kubenswrapper[4893]: E0314 08:03:49.338740 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9\": container with ID starting with 17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9 not found: ID does not exist" containerID="17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.338776 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9"} err="failed to get container status \"17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9\": rpc error: code = NotFound desc = could not find container \"17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9\": container with ID starting with 17cb7a2505d2c592e56bca69738f7300296b1c07b2658606eb253aaf6be431e9 not found: ID does not exist" Mar 14 08:03:49 crc kubenswrapper[4893]: I0314 08:03:49.384813 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" path="/var/lib/kubelet/pods/fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5/volumes" Mar 14 08:03:50 crc kubenswrapper[4893]: I0314 08:03:50.376318 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:03:50 crc kubenswrapper[4893]: E0314 08:03:50.377255 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.142991 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557924-5mtgh"] Mar 14 08:04:00 crc kubenswrapper[4893]: E0314 08:04:00.144049 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="extract-content" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.144067 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="extract-content" Mar 14 08:04:00 crc kubenswrapper[4893]: E0314 08:04:00.144098 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="registry-server" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.144108 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="registry-server" Mar 14 08:04:00 crc kubenswrapper[4893]: E0314 08:04:00.144127 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="extract-utilities" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.144135 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="extract-utilities" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.144305 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6b7d1b-fbb9-4da9-91ae-aa87f2f5ccd5" containerName="registry-server" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.144861 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.147709 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.148283 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.148353 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.152975 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-5mtgh"] Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.296722 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg928\" (UniqueName: \"kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928\") pod \"auto-csr-approver-29557924-5mtgh\" (UID: \"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509\") " pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.398301 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg928\" (UniqueName: \"kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928\") pod \"auto-csr-approver-29557924-5mtgh\" (UID: \"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509\") " pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.421024 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg928\" (UniqueName: \"kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928\") pod \"auto-csr-approver-29557924-5mtgh\" (UID: \"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509\") " 
pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.464587 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:00 crc kubenswrapper[4893]: I0314 08:04:00.874165 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-5mtgh"] Mar 14 08:04:01 crc kubenswrapper[4893]: I0314 08:04:01.345847 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" event={"ID":"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509","Type":"ContainerStarted","Data":"261a2c2fef442706ab1564d39834565c22af366d3449b95fe26926cb5cf7ba08"} Mar 14 08:04:01 crc kubenswrapper[4893]: I0314 08:04:01.389023 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:04:01 crc kubenswrapper[4893]: E0314 08:04:01.389471 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:04:02 crc kubenswrapper[4893]: I0314 08:04:02.354409 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" event={"ID":"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509","Type":"ContainerStarted","Data":"f36cfc7f44b0bd191cd4564355c44ede7e29936d8509180985064df546cbfc2e"} Mar 14 08:04:02 crc kubenswrapper[4893]: I0314 08:04:02.375345 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" podStartSLOduration=1.36454657 
podStartE2EDuration="2.375324805s" podCreationTimestamp="2026-03-14 08:04:00 +0000 UTC" firstStartedPulling="2026-03-14 08:04:00.882715865 +0000 UTC m=+3920.144892647" lastFinishedPulling="2026-03-14 08:04:01.89349409 +0000 UTC m=+3921.155670882" observedRunningTime="2026-03-14 08:04:02.37104303 +0000 UTC m=+3921.633219832" watchObservedRunningTime="2026-03-14 08:04:02.375324805 +0000 UTC m=+3921.637501597" Mar 14 08:04:03 crc kubenswrapper[4893]: I0314 08:04:03.363791 4893 generic.go:334] "Generic (PLEG): container finished" podID="7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" containerID="f36cfc7f44b0bd191cd4564355c44ede7e29936d8509180985064df546cbfc2e" exitCode=0 Mar 14 08:04:03 crc kubenswrapper[4893]: I0314 08:04:03.363871 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" event={"ID":"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509","Type":"ContainerDied","Data":"f36cfc7f44b0bd191cd4564355c44ede7e29936d8509180985064df546cbfc2e"} Mar 14 08:04:04 crc kubenswrapper[4893]: I0314 08:04:04.724783 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:04 crc kubenswrapper[4893]: I0314 08:04:04.763253 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg928\" (UniqueName: \"kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928\") pod \"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509\" (UID: \"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509\") " Mar 14 08:04:04 crc kubenswrapper[4893]: I0314 08:04:04.769278 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928" (OuterVolumeSpecName: "kube-api-access-fg928") pod "7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" (UID: "7b6d70d2-3aaf-4301-ab2d-c8adbb0be509"). InnerVolumeSpecName "kube-api-access-fg928". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:04:04 crc kubenswrapper[4893]: I0314 08:04:04.864458 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg928\" (UniqueName: \"kubernetes.io/projected/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509-kube-api-access-fg928\") on node \"crc\" DevicePath \"\"" Mar 14 08:04:05 crc kubenswrapper[4893]: I0314 08:04:05.389350 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" Mar 14 08:04:05 crc kubenswrapper[4893]: I0314 08:04:05.401637 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-5mtgh" event={"ID":"7b6d70d2-3aaf-4301-ab2d-c8adbb0be509","Type":"ContainerDied","Data":"261a2c2fef442706ab1564d39834565c22af366d3449b95fe26926cb5cf7ba08"} Mar 14 08:04:05 crc kubenswrapper[4893]: I0314 08:04:05.401697 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="261a2c2fef442706ab1564d39834565c22af366d3449b95fe26926cb5cf7ba08" Mar 14 08:04:05 crc kubenswrapper[4893]: I0314 08:04:05.826792 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-qn6hf"] Mar 14 08:04:05 crc kubenswrapper[4893]: I0314 08:04:05.831530 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-qn6hf"] Mar 14 08:04:07 crc kubenswrapper[4893]: I0314 08:04:07.389269 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1235048d-af3c-4bfb-9211-d22046eabef2" path="/var/lib/kubelet/pods/1235048d-af3c-4bfb-9211-d22046eabef2/volumes" Mar 14 08:04:08 crc kubenswrapper[4893]: I0314 08:04:08.495984 4893 scope.go:117] "RemoveContainer" containerID="f51044283b1cb891aec40f5870424c5ec47527a01af0aba95c783a491c45a2fc" Mar 14 08:04:14 crc kubenswrapper[4893]: I0314 08:04:14.376575 4893 scope.go:117] "RemoveContainer" 
containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:04:14 crc kubenswrapper[4893]: E0314 08:04:14.377477 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:04:29 crc kubenswrapper[4893]: I0314 08:04:29.377503 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:04:29 crc kubenswrapper[4893]: E0314 08:04:29.378495 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:04:42 crc kubenswrapper[4893]: I0314 08:04:42.376078 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:04:42 crc kubenswrapper[4893]: E0314 08:04:42.376877 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:04:56 crc kubenswrapper[4893]: I0314 08:04:56.377166 4893 scope.go:117] 
"RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:04:56 crc kubenswrapper[4893]: E0314 08:04:56.378254 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:05:09 crc kubenswrapper[4893]: I0314 08:05:09.377247 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:05:09 crc kubenswrapper[4893]: E0314 08:05:09.377931 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:05:24 crc kubenswrapper[4893]: I0314 08:05:24.376812 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:05:24 crc kubenswrapper[4893]: E0314 08:05:24.377487 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:05:36 crc kubenswrapper[4893]: I0314 08:05:36.377197 
4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:05:36 crc kubenswrapper[4893]: E0314 08:05:36.378317 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:05:50 crc kubenswrapper[4893]: I0314 08:05:50.377834 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:05:50 crc kubenswrapper[4893]: E0314 08:05:50.379229 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.137713 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557926-sqlws"] Mar 14 08:06:00 crc kubenswrapper[4893]: E0314 08:06:00.139840 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" containerName="oc" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.139946 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" containerName="oc" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.140168 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" 
containerName="oc" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.140708 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.142565 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.143066 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.143100 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.155317 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-sqlws"] Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.231397 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqnkj\" (UniqueName: \"kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj\") pod \"auto-csr-approver-29557926-sqlws\" (UID: \"c2b72a40-c832-4b17-8135-1da465713199\") " pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.332360 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqnkj\" (UniqueName: \"kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj\") pod \"auto-csr-approver-29557926-sqlws\" (UID: \"c2b72a40-c832-4b17-8135-1da465713199\") " pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.350770 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqnkj\" (UniqueName: 
\"kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj\") pod \"auto-csr-approver-29557926-sqlws\" (UID: \"c2b72a40-c832-4b17-8135-1da465713199\") " pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.459810 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:00 crc kubenswrapper[4893]: I0314 08:06:00.844035 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-sqlws"] Mar 14 08:06:01 crc kubenswrapper[4893]: I0314 08:06:01.236598 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-sqlws" event={"ID":"c2b72a40-c832-4b17-8135-1da465713199","Type":"ContainerStarted","Data":"ad7bd859b1bccaee16adadac6f98e0e88f5d100a085d3cd32df50696af0e2322"} Mar 14 08:06:02 crc kubenswrapper[4893]: I0314 08:06:02.245176 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-sqlws" event={"ID":"c2b72a40-c832-4b17-8135-1da465713199","Type":"ContainerStarted","Data":"68359cc89d3363585d3ce74edb93b87c43115f11664ce3f87616861367265c9a"} Mar 14 08:06:02 crc kubenswrapper[4893]: I0314 08:06:02.259046 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557926-sqlws" podStartSLOduration=1.271704938 podStartE2EDuration="2.259026659s" podCreationTimestamp="2026-03-14 08:06:00 +0000 UTC" firstStartedPulling="2026-03-14 08:06:00.851719378 +0000 UTC m=+4040.113896170" lastFinishedPulling="2026-03-14 08:06:01.839041059 +0000 UTC m=+4041.101217891" observedRunningTime="2026-03-14 08:06:02.256484607 +0000 UTC m=+4041.518661409" watchObservedRunningTime="2026-03-14 08:06:02.259026659 +0000 UTC m=+4041.521203451" Mar 14 08:06:03 crc kubenswrapper[4893]: I0314 08:06:03.254068 4893 generic.go:334] "Generic (PLEG): container 
finished" podID="c2b72a40-c832-4b17-8135-1da465713199" containerID="68359cc89d3363585d3ce74edb93b87c43115f11664ce3f87616861367265c9a" exitCode=0 Mar 14 08:06:03 crc kubenswrapper[4893]: I0314 08:06:03.254134 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-sqlws" event={"ID":"c2b72a40-c832-4b17-8135-1da465713199","Type":"ContainerDied","Data":"68359cc89d3363585d3ce74edb93b87c43115f11664ce3f87616861367265c9a"} Mar 14 08:06:03 crc kubenswrapper[4893]: I0314 08:06:03.377644 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:06:03 crc kubenswrapper[4893]: E0314 08:06:03.378268 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:06:04 crc kubenswrapper[4893]: I0314 08:06:04.610011 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:04 crc kubenswrapper[4893]: I0314 08:06:04.696915 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqnkj\" (UniqueName: \"kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj\") pod \"c2b72a40-c832-4b17-8135-1da465713199\" (UID: \"c2b72a40-c832-4b17-8135-1da465713199\") " Mar 14 08:06:04 crc kubenswrapper[4893]: I0314 08:06:04.705810 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj" (OuterVolumeSpecName: "kube-api-access-bqnkj") pod "c2b72a40-c832-4b17-8135-1da465713199" (UID: "c2b72a40-c832-4b17-8135-1da465713199"). InnerVolumeSpecName "kube-api-access-bqnkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:06:04 crc kubenswrapper[4893]: I0314 08:06:04.798367 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqnkj\" (UniqueName: \"kubernetes.io/projected/c2b72a40-c832-4b17-8135-1da465713199-kube-api-access-bqnkj\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:05 crc kubenswrapper[4893]: I0314 08:06:05.270380 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-sqlws" event={"ID":"c2b72a40-c832-4b17-8135-1da465713199","Type":"ContainerDied","Data":"ad7bd859b1bccaee16adadac6f98e0e88f5d100a085d3cd32df50696af0e2322"} Mar 14 08:06:05 crc kubenswrapper[4893]: I0314 08:06:05.270445 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad7bd859b1bccaee16adadac6f98e0e88f5d100a085d3cd32df50696af0e2322" Mar 14 08:06:05 crc kubenswrapper[4893]: I0314 08:06:05.270470 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-sqlws" Mar 14 08:06:05 crc kubenswrapper[4893]: I0314 08:06:05.672069 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-2t5wd"] Mar 14 08:06:05 crc kubenswrapper[4893]: I0314 08:06:05.679417 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-2t5wd"] Mar 14 08:06:07 crc kubenswrapper[4893]: I0314 08:06:07.387562 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc321fc-d0f5-4d3a-94a2-51680b2cbb12" path="/var/lib/kubelet/pods/1cc321fc-d0f5-4d3a-94a2-51680b2cbb12/volumes" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.721048 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:09 crc kubenswrapper[4893]: E0314 08:06:09.721692 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b72a40-c832-4b17-8135-1da465713199" containerName="oc" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.721708 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b72a40-c832-4b17-8135-1da465713199" containerName="oc" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.721895 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b72a40-c832-4b17-8135-1da465713199" containerName="oc" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.722968 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.752967 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.867391 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.867467 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.867537 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd744\" (UniqueName: \"kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.968631 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd744\" (UniqueName: \"kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.968707 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.968796 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.969383 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.969562 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:09 crc kubenswrapper[4893]: I0314 08:06:09.990132 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd744\" (UniqueName: \"kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744\") pod \"certified-operators-2lhdw\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:10 crc kubenswrapper[4893]: I0314 08:06:10.043662 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:10 crc kubenswrapper[4893]: I0314 08:06:10.466229 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:11 crc kubenswrapper[4893]: I0314 08:06:11.323047 4893 generic.go:334] "Generic (PLEG): container finished" podID="16663f81-6685-4a5b-b41f-2f73c874d719" containerID="86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547" exitCode=0 Mar 14 08:06:11 crc kubenswrapper[4893]: I0314 08:06:11.323149 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerDied","Data":"86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547"} Mar 14 08:06:11 crc kubenswrapper[4893]: I0314 08:06:11.323442 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerStarted","Data":"0a6aa37ee3047183b632ac786cf6658386bf7f7819893ead1cced6a8108ff837"} Mar 14 08:06:12 crc kubenswrapper[4893]: I0314 08:06:12.335369 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerStarted","Data":"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f"} Mar 14 08:06:13 crc kubenswrapper[4893]: I0314 08:06:13.341957 4893 generic.go:334] "Generic (PLEG): container finished" podID="16663f81-6685-4a5b-b41f-2f73c874d719" containerID="6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f" exitCode=0 Mar 14 08:06:13 crc kubenswrapper[4893]: I0314 08:06:13.342004 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" 
event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerDied","Data":"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f"} Mar 14 08:06:15 crc kubenswrapper[4893]: I0314 08:06:15.359999 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerStarted","Data":"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7"} Mar 14 08:06:15 crc kubenswrapper[4893]: I0314 08:06:15.398604 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2lhdw" podStartSLOduration=3.348762392 podStartE2EDuration="6.39857135s" podCreationTimestamp="2026-03-14 08:06:09 +0000 UTC" firstStartedPulling="2026-03-14 08:06:11.325158648 +0000 UTC m=+4050.587335440" lastFinishedPulling="2026-03-14 08:06:14.374967566 +0000 UTC m=+4053.637144398" observedRunningTime="2026-03-14 08:06:15.391421725 +0000 UTC m=+4054.653598537" watchObservedRunningTime="2026-03-14 08:06:15.39857135 +0000 UTC m=+4054.660748172" Mar 14 08:06:18 crc kubenswrapper[4893]: I0314 08:06:18.376555 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:06:18 crc kubenswrapper[4893]: E0314 08:06:18.377052 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:06:20 crc kubenswrapper[4893]: I0314 08:06:20.045471 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:20 crc 
kubenswrapper[4893]: I0314 08:06:20.046087 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:20 crc kubenswrapper[4893]: I0314 08:06:20.081137 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:20 crc kubenswrapper[4893]: I0314 08:06:20.901980 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:20 crc kubenswrapper[4893]: I0314 08:06:20.966280 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.424033 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2lhdw" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="registry-server" containerID="cri-o://54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7" gracePeriod=2 Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.873772 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.972850 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities\") pod \"16663f81-6685-4a5b-b41f-2f73c874d719\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.972983 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd744\" (UniqueName: \"kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744\") pod \"16663f81-6685-4a5b-b41f-2f73c874d719\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.973086 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content\") pod \"16663f81-6685-4a5b-b41f-2f73c874d719\" (UID: \"16663f81-6685-4a5b-b41f-2f73c874d719\") " Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.974574 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities" (OuterVolumeSpecName: "utilities") pod "16663f81-6685-4a5b-b41f-2f73c874d719" (UID: "16663f81-6685-4a5b-b41f-2f73c874d719"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:06:22 crc kubenswrapper[4893]: I0314 08:06:22.980071 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744" (OuterVolumeSpecName: "kube-api-access-xd744") pod "16663f81-6685-4a5b-b41f-2f73c874d719" (UID: "16663f81-6685-4a5b-b41f-2f73c874d719"). InnerVolumeSpecName "kube-api-access-xd744". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.075591 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.075664 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd744\" (UniqueName: \"kubernetes.io/projected/16663f81-6685-4a5b-b41f-2f73c874d719-kube-api-access-xd744\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.436574 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerDied","Data":"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7"} Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.436680 4893 scope.go:117] "RemoveContainer" containerID="54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.436705 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lhdw" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.436572 4893 generic.go:334] "Generic (PLEG): container finished" podID="16663f81-6685-4a5b-b41f-2f73c874d719" containerID="54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7" exitCode=0 Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.436770 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lhdw" event={"ID":"16663f81-6685-4a5b-b41f-2f73c874d719","Type":"ContainerDied","Data":"0a6aa37ee3047183b632ac786cf6658386bf7f7819893ead1cced6a8108ff837"} Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.469966 4893 scope.go:117] "RemoveContainer" containerID="6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.501604 4893 scope.go:117] "RemoveContainer" containerID="86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.542259 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16663f81-6685-4a5b-b41f-2f73c874d719" (UID: "16663f81-6685-4a5b-b41f-2f73c874d719"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.551227 4893 scope.go:117] "RemoveContainer" containerID="54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7" Mar 14 08:06:23 crc kubenswrapper[4893]: E0314 08:06:23.552296 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7\": container with ID starting with 54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7 not found: ID does not exist" containerID="54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.552471 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7"} err="failed to get container status \"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7\": rpc error: code = NotFound desc = could not find container \"54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7\": container with ID starting with 54db5c27476cfad6e64ff35f0aea375e56413979db78b64008c5a32148c282e7 not found: ID does not exist" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.552698 4893 scope.go:117] "RemoveContainer" containerID="6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f" Mar 14 08:06:23 crc kubenswrapper[4893]: E0314 08:06:23.553478 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f\": container with ID starting with 6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f not found: ID does not exist" containerID="6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.553565 
4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f"} err="failed to get container status \"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f\": rpc error: code = NotFound desc = could not find container \"6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f\": container with ID starting with 6ffca4022394a59563fb31434d50d85b60f2580cbad71ae33d905c19ce34f54f not found: ID does not exist" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.553611 4893 scope.go:117] "RemoveContainer" containerID="86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547" Mar 14 08:06:23 crc kubenswrapper[4893]: E0314 08:06:23.554195 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547\": container with ID starting with 86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547 not found: ID does not exist" containerID="86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.554258 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547"} err="failed to get container status \"86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547\": rpc error: code = NotFound desc = could not find container \"86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547\": container with ID starting with 86dbef5d931c5dd293458b852dd6c54644b295e5e50d8caccdee67263a657547 not found: ID does not exist" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.583066 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/16663f81-6685-4a5b-b41f-2f73c874d719-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.804015 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:23 crc kubenswrapper[4893]: I0314 08:06:23.809029 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2lhdw"] Mar 14 08:06:25 crc kubenswrapper[4893]: I0314 08:06:25.385005 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" path="/var/lib/kubelet/pods/16663f81-6685-4a5b-b41f-2f73c874d719/volumes" Mar 14 08:06:30 crc kubenswrapper[4893]: I0314 08:06:30.377242 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:06:31 crc kubenswrapper[4893]: I0314 08:06:31.494762 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e"} Mar 14 08:07:08 crc kubenswrapper[4893]: I0314 08:07:08.616810 4893 scope.go:117] "RemoveContainer" containerID="8ccba8a80c4d3b3b00968d284f510a5117e0b3aa5a15da14d259d5b48c258fc4" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.902477 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:07:37 crc kubenswrapper[4893]: E0314 08:07:37.903786 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="extract-utilities" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.903840 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="extract-utilities" Mar 14 08:07:37 crc 
kubenswrapper[4893]: E0314 08:07:37.903870 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="registry-server" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.903882 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="registry-server" Mar 14 08:07:37 crc kubenswrapper[4893]: E0314 08:07:37.903910 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="extract-content" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.903920 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="extract-content" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.904310 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="16663f81-6685-4a5b-b41f-2f73c874d719" containerName="registry-server" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.906126 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:37 crc kubenswrapper[4893]: I0314 08:07:37.922902 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.063825 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8rzw\" (UniqueName: \"kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.063918 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.064114 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.165120 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.165233 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.165281 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8rzw\" (UniqueName: \"kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.165697 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.165697 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.184713 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8rzw\" (UniqueName: \"kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw\") pod \"redhat-operators-4jcmt\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.228666 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.653116 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.997079 4893 generic.go:334] "Generic (PLEG): container finished" podID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerID="87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51" exitCode=0 Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.997126 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerDied","Data":"87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51"} Mar 14 08:07:38 crc kubenswrapper[4893]: I0314 08:07:38.997152 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerStarted","Data":"fb3bb7c7a48b4ab5333e4f5bedc258a92c2c99d67e862a0f04da283e4c3b07ad"} Mar 14 08:07:43 crc kubenswrapper[4893]: I0314 08:07:43.033593 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerStarted","Data":"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934"} Mar 14 08:07:44 crc kubenswrapper[4893]: I0314 08:07:44.046076 4893 generic.go:334] "Generic (PLEG): container finished" podID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerID="27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934" exitCode=0 Mar 14 08:07:44 crc kubenswrapper[4893]: I0314 08:07:44.046148 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" 
event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerDied","Data":"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934"} Mar 14 08:07:45 crc kubenswrapper[4893]: I0314 08:07:45.060687 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerStarted","Data":"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd"} Mar 14 08:07:45 crc kubenswrapper[4893]: I0314 08:07:45.103186 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4jcmt" podStartSLOduration=3.374043683 podStartE2EDuration="8.103152014s" podCreationTimestamp="2026-03-14 08:07:37 +0000 UTC" firstStartedPulling="2026-03-14 08:07:40.005973345 +0000 UTC m=+4139.268150147" lastFinishedPulling="2026-03-14 08:07:44.735081676 +0000 UTC m=+4143.997258478" observedRunningTime="2026-03-14 08:07:45.092113395 +0000 UTC m=+4144.354290227" watchObservedRunningTime="2026-03-14 08:07:45.103152014 +0000 UTC m=+4144.365328846" Mar 14 08:07:48 crc kubenswrapper[4893]: I0314 08:07:48.229847 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:48 crc kubenswrapper[4893]: I0314 08:07:48.230180 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:49 crc kubenswrapper[4893]: I0314 08:07:49.285546 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4jcmt" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="registry-server" probeResult="failure" output=< Mar 14 08:07:49 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 08:07:49 crc kubenswrapper[4893]: > Mar 14 08:07:58 crc kubenswrapper[4893]: I0314 08:07:58.274585 4893 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:58 crc kubenswrapper[4893]: I0314 08:07:58.318370 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:07:58 crc kubenswrapper[4893]: I0314 08:07:58.520492 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.136934 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557928-5mshp"] Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.138452 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.141022 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.141275 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.141442 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.149715 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-5mshp"] Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.176014 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4jcmt" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="registry-server" containerID="cri-o://0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd" gracePeriod=2 Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.308482 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfj9g\" (UniqueName: \"kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g\") pod \"auto-csr-approver-29557928-5mshp\" (UID: \"9cc1448f-bed8-4d56-a646-6e5ddeb86c60\") " pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.409486 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfj9g\" (UniqueName: \"kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g\") pod \"auto-csr-approver-29557928-5mshp\" (UID: \"9cc1448f-bed8-4d56-a646-6e5ddeb86c60\") " pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.430098 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfj9g\" (UniqueName: \"kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g\") pod \"auto-csr-approver-29557928-5mshp\" (UID: \"9cc1448f-bed8-4d56-a646-6e5ddeb86c60\") " pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.456398 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.557755 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.713461 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content\") pod \"a664c04b-9cad-4269-8877-145d8a5af9d7\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.713817 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities\") pod \"a664c04b-9cad-4269-8877-145d8a5af9d7\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.713873 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8rzw\" (UniqueName: \"kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw\") pod \"a664c04b-9cad-4269-8877-145d8a5af9d7\" (UID: \"a664c04b-9cad-4269-8877-145d8a5af9d7\") " Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.715274 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities" (OuterVolumeSpecName: "utilities") pod "a664c04b-9cad-4269-8877-145d8a5af9d7" (UID: "a664c04b-9cad-4269-8877-145d8a5af9d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.821714 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.867997 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a664c04b-9cad-4269-8877-145d8a5af9d7" (UID: "a664c04b-9cad-4269-8877-145d8a5af9d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.869827 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw" (OuterVolumeSpecName: "kube-api-access-n8rzw") pod "a664c04b-9cad-4269-8877-145d8a5af9d7" (UID: "a664c04b-9cad-4269-8877-145d8a5af9d7"). InnerVolumeSpecName "kube-api-access-n8rzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.895810 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-5mshp"] Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.923571 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a664c04b-9cad-4269-8877-145d8a5af9d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:08:00 crc kubenswrapper[4893]: I0314 08:08:00.923797 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8rzw\" (UniqueName: \"kubernetes.io/projected/a664c04b-9cad-4269-8877-145d8a5af9d7-kube-api-access-n8rzw\") on node \"crc\" DevicePath \"\"" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.189484 4893 generic.go:334] "Generic (PLEG): container finished" podID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerID="0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd" exitCode=0 Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.189571 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jcmt" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.189565 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerDied","Data":"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd"} Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.189793 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jcmt" event={"ID":"a664c04b-9cad-4269-8877-145d8a5af9d7","Type":"ContainerDied","Data":"fb3bb7c7a48b4ab5333e4f5bedc258a92c2c99d67e862a0f04da283e4c3b07ad"} Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.189832 4893 scope.go:117] "RemoveContainer" containerID="0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.191208 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-5mshp" event={"ID":"9cc1448f-bed8-4d56-a646-6e5ddeb86c60","Type":"ContainerStarted","Data":"a8512cfe77724fb2b610646e03ff46c23e2bac93746a791ba406fe784a183044"} Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.212385 4893 scope.go:117] "RemoveContainer" containerID="27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.224181 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.231419 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4jcmt"] Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.242969 4893 scope.go:117] "RemoveContainer" containerID="87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.263777 4893 scope.go:117] 
"RemoveContainer" containerID="0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd" Mar 14 08:08:01 crc kubenswrapper[4893]: E0314 08:08:01.264326 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd\": container with ID starting with 0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd not found: ID does not exist" containerID="0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.264383 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd"} err="failed to get container status \"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd\": rpc error: code = NotFound desc = could not find container \"0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd\": container with ID starting with 0bf03246a647bd4008d61cff305b56270b2502259a23f30cfa88aad7878814bd not found: ID does not exist" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.264411 4893 scope.go:117] "RemoveContainer" containerID="27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934" Mar 14 08:08:01 crc kubenswrapper[4893]: E0314 08:08:01.264836 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934\": container with ID starting with 27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934 not found: ID does not exist" containerID="27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.264872 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934"} err="failed to get container status \"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934\": rpc error: code = NotFound desc = could not find container \"27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934\": container with ID starting with 27084352c0e160f4fb51081179d448050f7dc8eadc7cac7067a2565642d1d934 not found: ID does not exist" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.264893 4893 scope.go:117] "RemoveContainer" containerID="87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51" Mar 14 08:08:01 crc kubenswrapper[4893]: E0314 08:08:01.265174 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51\": container with ID starting with 87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51 not found: ID does not exist" containerID="87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.265207 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51"} err="failed to get container status \"87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51\": rpc error: code = NotFound desc = could not find container \"87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51\": container with ID starting with 87ee203690c7f319c9a6269255b107c5bf56f27602e7afc543582751d0637b51 not found: ID does not exist" Mar 14 08:08:01 crc kubenswrapper[4893]: I0314 08:08:01.386083 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" path="/var/lib/kubelet/pods/a664c04b-9cad-4269-8877-145d8a5af9d7/volumes" Mar 14 08:08:02 crc kubenswrapper[4893]: I0314 
08:08:02.200283 4893 generic.go:334] "Generic (PLEG): container finished" podID="9cc1448f-bed8-4d56-a646-6e5ddeb86c60" containerID="9a5126e7aeeb46a2f863b048a2bbe743e8e10b127e955b060ab78f44219af3fd" exitCode=0 Mar 14 08:08:02 crc kubenswrapper[4893]: I0314 08:08:02.200332 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-5mshp" event={"ID":"9cc1448f-bed8-4d56-a646-6e5ddeb86c60","Type":"ContainerDied","Data":"9a5126e7aeeb46a2f863b048a2bbe743e8e10b127e955b060ab78f44219af3fd"} Mar 14 08:08:03 crc kubenswrapper[4893]: I0314 08:08:03.646583 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:03 crc kubenswrapper[4893]: I0314 08:08:03.765890 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfj9g\" (UniqueName: \"kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g\") pod \"9cc1448f-bed8-4d56-a646-6e5ddeb86c60\" (UID: \"9cc1448f-bed8-4d56-a646-6e5ddeb86c60\") " Mar 14 08:08:03 crc kubenswrapper[4893]: I0314 08:08:03.772051 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g" (OuterVolumeSpecName: "kube-api-access-gfj9g") pod "9cc1448f-bed8-4d56-a646-6e5ddeb86c60" (UID: "9cc1448f-bed8-4d56-a646-6e5ddeb86c60"). InnerVolumeSpecName "kube-api-access-gfj9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:08:03 crc kubenswrapper[4893]: I0314 08:08:03.867859 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfj9g\" (UniqueName: \"kubernetes.io/projected/9cc1448f-bed8-4d56-a646-6e5ddeb86c60-kube-api-access-gfj9g\") on node \"crc\" DevicePath \"\"" Mar 14 08:08:04 crc kubenswrapper[4893]: I0314 08:08:04.229951 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-5mshp" event={"ID":"9cc1448f-bed8-4d56-a646-6e5ddeb86c60","Type":"ContainerDied","Data":"a8512cfe77724fb2b610646e03ff46c23e2bac93746a791ba406fe784a183044"} Mar 14 08:08:04 crc kubenswrapper[4893]: I0314 08:08:04.229989 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8512cfe77724fb2b610646e03ff46c23e2bac93746a791ba406fe784a183044" Mar 14 08:08:04 crc kubenswrapper[4893]: I0314 08:08:04.230002 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-5mshp" Mar 14 08:08:04 crc kubenswrapper[4893]: I0314 08:08:04.718142 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-v7wgf"] Mar 14 08:08:04 crc kubenswrapper[4893]: I0314 08:08:04.725500 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-v7wgf"] Mar 14 08:08:05 crc kubenswrapper[4893]: I0314 08:08:05.385136 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f084cb0-1c92-4a6a-879a-83c7e30161b9" path="/var/lib/kubelet/pods/6f084cb0-1c92-4a6a-879a-83c7e30161b9/volumes" Mar 14 08:08:08 crc kubenswrapper[4893]: I0314 08:08:08.695256 4893 scope.go:117] "RemoveContainer" containerID="319687da446251c9089809ac2740feba427fc38527c13ba548c0dc1e69aa05d8" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.302222 4893 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:08:54 crc kubenswrapper[4893]: E0314 08:08:54.304705 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="registry-server" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.304846 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="registry-server" Mar 14 08:08:54 crc kubenswrapper[4893]: E0314 08:08:54.304965 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="extract-utilities" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.305062 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="extract-utilities" Mar 14 08:08:54 crc kubenswrapper[4893]: E0314 08:08:54.305192 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc1448f-bed8-4d56-a646-6e5ddeb86c60" containerName="oc" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.305282 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc1448f-bed8-4d56-a646-6e5ddeb86c60" containerName="oc" Mar 14 08:08:54 crc kubenswrapper[4893]: E0314 08:08:54.305379 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="extract-content" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.305459 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="extract-content" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.305751 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a664c04b-9cad-4269-8877-145d8a5af9d7" containerName="registry-server" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.305866 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc1448f-bed8-4d56-a646-6e5ddeb86c60" 
containerName="oc" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.307490 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.328815 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.428984 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.429035 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.429079 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mzp\" (UniqueName: \"kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.530367 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mzp\" (UniqueName: \"kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " 
pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.530475 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.530503 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.530923 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.530989 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.554470 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mzp\" (UniqueName: \"kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp\") pod \"redhat-marketplace-p24df\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 
08:08:54 crc kubenswrapper[4893]: I0314 08:08:54.644602 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:08:55 crc kubenswrapper[4893]: I0314 08:08:55.082736 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:08:55 crc kubenswrapper[4893]: I0314 08:08:55.602697 4893 generic.go:334] "Generic (PLEG): container finished" podID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerID="9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b" exitCode=0 Mar 14 08:08:55 crc kubenswrapper[4893]: I0314 08:08:55.602779 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerDied","Data":"9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b"} Mar 14 08:08:55 crc kubenswrapper[4893]: I0314 08:08:55.604034 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerStarted","Data":"9912493f9db1204fb75d498269d1b1d15d70ba8fee01709002eb7e06008cf7c5"} Mar 14 08:08:55 crc kubenswrapper[4893]: I0314 08:08:55.604947 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:08:56 crc kubenswrapper[4893]: I0314 08:08:56.612145 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerStarted","Data":"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f"} Mar 14 08:08:57 crc kubenswrapper[4893]: I0314 08:08:57.620840 4893 generic.go:334] "Generic (PLEG): container finished" podID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerID="71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f" exitCode=0 
Mar 14 08:08:57 crc kubenswrapper[4893]: I0314 08:08:57.621218 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerDied","Data":"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f"} Mar 14 08:08:58 crc kubenswrapper[4893]: I0314 08:08:58.632053 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerStarted","Data":"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb"} Mar 14 08:08:58 crc kubenswrapper[4893]: I0314 08:08:58.653334 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p24df" podStartSLOduration=1.97699729 podStartE2EDuration="4.653313719s" podCreationTimestamp="2026-03-14 08:08:54 +0000 UTC" firstStartedPulling="2026-03-14 08:08:55.60470432 +0000 UTC m=+4214.866881112" lastFinishedPulling="2026-03-14 08:08:58.281020739 +0000 UTC m=+4217.543197541" observedRunningTime="2026-03-14 08:08:58.64761122 +0000 UTC m=+4217.909788032" watchObservedRunningTime="2026-03-14 08:08:58.653313719 +0000 UTC m=+4217.915490511" Mar 14 08:08:59 crc kubenswrapper[4893]: I0314 08:08:59.730660 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:08:59 crc kubenswrapper[4893]: I0314 08:08:59.730748 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 14 08:09:04 crc kubenswrapper[4893]: I0314 08:09:04.645590 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:04 crc kubenswrapper[4893]: I0314 08:09:04.646162 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:04 crc kubenswrapper[4893]: I0314 08:09:04.695761 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:04 crc kubenswrapper[4893]: I0314 08:09:04.740070 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:04 crc kubenswrapper[4893]: I0314 08:09:04.926312 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:09:06 crc kubenswrapper[4893]: I0314 08:09:06.687655 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p24df" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="registry-server" containerID="cri-o://f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb" gracePeriod=2 Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.142762 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.302252 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9mzp\" (UniqueName: \"kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp\") pod \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.302375 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities\") pod \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.302449 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content\") pod \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\" (UID: \"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf\") " Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.303758 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities" (OuterVolumeSpecName: "utilities") pod "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" (UID: "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.307665 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp" (OuterVolumeSpecName: "kube-api-access-t9mzp") pod "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" (UID: "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf"). InnerVolumeSpecName "kube-api-access-t9mzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.335306 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" (UID: "e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.405625 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.405660 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.405675 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9mzp\" (UniqueName: \"kubernetes.io/projected/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf-kube-api-access-t9mzp\") on node \"crc\" DevicePath \"\"" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.698844 4893 generic.go:334] "Generic (PLEG): container finished" podID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerID="f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb" exitCode=0 Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.698899 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p24df" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.698907 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerDied","Data":"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb"} Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.698983 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p24df" event={"ID":"e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf","Type":"ContainerDied","Data":"9912493f9db1204fb75d498269d1b1d15d70ba8fee01709002eb7e06008cf7c5"} Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.699021 4893 scope.go:117] "RemoveContainer" containerID="f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.732556 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.733936 4893 scope.go:117] "RemoveContainer" containerID="71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.743600 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p24df"] Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.755438 4893 scope.go:117] "RemoveContainer" containerID="9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.784487 4893 scope.go:117] "RemoveContainer" containerID="f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb" Mar 14 08:09:07 crc kubenswrapper[4893]: E0314 08:09:07.785576 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb\": container with ID starting with f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb not found: ID does not exist" containerID="f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.785796 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb"} err="failed to get container status \"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb\": rpc error: code = NotFound desc = could not find container \"f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb\": container with ID starting with f58846d6214b1a6a920a6a060602598f497283baae1adcc5a2bdf80a9e3367fb not found: ID does not exist" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.785848 4893 scope.go:117] "RemoveContainer" containerID="71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f" Mar 14 08:09:07 crc kubenswrapper[4893]: E0314 08:09:07.786206 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f\": container with ID starting with 71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f not found: ID does not exist" containerID="71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.786241 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f"} err="failed to get container status \"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f\": rpc error: code = NotFound desc = could not find container \"71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f\": container with ID 
starting with 71944f4b10a9e1bbac3f8d129f275a9fbe8d5f85ae4d9d31ed222cb6ea01ce1f not found: ID does not exist" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.786263 4893 scope.go:117] "RemoveContainer" containerID="9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b" Mar 14 08:09:07 crc kubenswrapper[4893]: E0314 08:09:07.786577 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b\": container with ID starting with 9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b not found: ID does not exist" containerID="9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b" Mar 14 08:09:07 crc kubenswrapper[4893]: I0314 08:09:07.786617 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b"} err="failed to get container status \"9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b\": rpc error: code = NotFound desc = could not find container \"9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b\": container with ID starting with 9ccdb650ea9a780c6364ce105e571cb997fe0e9dd8ce2d4f479948fc02ac480b not found: ID does not exist" Mar 14 08:09:09 crc kubenswrapper[4893]: I0314 08:09:09.385694 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" path="/var/lib/kubelet/pods/e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf/volumes" Mar 14 08:09:29 crc kubenswrapper[4893]: I0314 08:09:29.731697 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:09:29 crc kubenswrapper[4893]: I0314 
08:09:29.733453 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:09:59 crc kubenswrapper[4893]: I0314 08:09:59.730946 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:09:59 crc kubenswrapper[4893]: I0314 08:09:59.731495 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:09:59 crc kubenswrapper[4893]: I0314 08:09:59.731586 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:09:59 crc kubenswrapper[4893]: I0314 08:09:59.732271 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:09:59 crc kubenswrapper[4893]: I0314 08:09:59.732364 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" containerID="cri-o://66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e" gracePeriod=600 Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.141584 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557930-fdw5v"] Mar 14 08:10:00 crc kubenswrapper[4893]: E0314 08:10:00.142286 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="registry-server" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.142306 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="registry-server" Mar 14 08:10:00 crc kubenswrapper[4893]: E0314 08:10:00.142323 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="extract-content" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.142329 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="extract-content" Mar 14 08:10:00 crc kubenswrapper[4893]: E0314 08:10:00.142353 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="extract-utilities" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.142362 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="extract-utilities" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.142523 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c2dc75-c1e6-4101-9432-b9f8ace5a3cf" containerName="registry-server" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.143817 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.147129 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.147234 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.149958 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.155510 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-fdw5v"] Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.163160 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e" exitCode=0 Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.163209 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e"} Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.163288 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"} Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.163310 4893 scope.go:117] "RemoveContainer" containerID="e12005c3784aac2361996712f043ac7b2eec25802043ed8145c0668053f67c8d" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.233460 4893 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnj8j\" (UniqueName: \"kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j\") pod \"auto-csr-approver-29557930-fdw5v\" (UID: \"793b1f76-6d6b-434b-8d3a-c10346c023d7\") " pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.334383 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnj8j\" (UniqueName: \"kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j\") pod \"auto-csr-approver-29557930-fdw5v\" (UID: \"793b1f76-6d6b-434b-8d3a-c10346c023d7\") " pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.351508 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnj8j\" (UniqueName: \"kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j\") pod \"auto-csr-approver-29557930-fdw5v\" (UID: \"793b1f76-6d6b-434b-8d3a-c10346c023d7\") " pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.464992 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:00 crc kubenswrapper[4893]: I0314 08:10:00.662123 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-fdw5v"] Mar 14 08:10:01 crc kubenswrapper[4893]: I0314 08:10:01.174950 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" event={"ID":"793b1f76-6d6b-434b-8d3a-c10346c023d7","Type":"ContainerStarted","Data":"cb7ad27f7f46b0573a207f43e47420f91633ab1870bc46de4ed2b28b91339bdf"} Mar 14 08:10:02 crc kubenswrapper[4893]: I0314 08:10:02.185843 4893 generic.go:334] "Generic (PLEG): container finished" podID="793b1f76-6d6b-434b-8d3a-c10346c023d7" containerID="ceeecf3d73c3a5b59f1cfbc530df48a14116a849ebc5aab6f2b97b53e15f8ce6" exitCode=0 Mar 14 08:10:02 crc kubenswrapper[4893]: I0314 08:10:02.185971 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" event={"ID":"793b1f76-6d6b-434b-8d3a-c10346c023d7","Type":"ContainerDied","Data":"ceeecf3d73c3a5b59f1cfbc530df48a14116a849ebc5aab6f2b97b53e15f8ce6"} Mar 14 08:10:03 crc kubenswrapper[4893]: I0314 08:10:03.580955 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:03 crc kubenswrapper[4893]: I0314 08:10:03.692749 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnj8j\" (UniqueName: \"kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j\") pod \"793b1f76-6d6b-434b-8d3a-c10346c023d7\" (UID: \"793b1f76-6d6b-434b-8d3a-c10346c023d7\") " Mar 14 08:10:03 crc kubenswrapper[4893]: I0314 08:10:03.707725 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j" (OuterVolumeSpecName: "kube-api-access-bnj8j") pod "793b1f76-6d6b-434b-8d3a-c10346c023d7" (UID: "793b1f76-6d6b-434b-8d3a-c10346c023d7"). InnerVolumeSpecName "kube-api-access-bnj8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:10:03 crc kubenswrapper[4893]: I0314 08:10:03.794370 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnj8j\" (UniqueName: \"kubernetes.io/projected/793b1f76-6d6b-434b-8d3a-c10346c023d7-kube-api-access-bnj8j\") on node \"crc\" DevicePath \"\"" Mar 14 08:10:04 crc kubenswrapper[4893]: I0314 08:10:04.209337 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" event={"ID":"793b1f76-6d6b-434b-8d3a-c10346c023d7","Type":"ContainerDied","Data":"cb7ad27f7f46b0573a207f43e47420f91633ab1870bc46de4ed2b28b91339bdf"} Mar 14 08:10:04 crc kubenswrapper[4893]: I0314 08:10:04.209399 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7ad27f7f46b0573a207f43e47420f91633ab1870bc46de4ed2b28b91339bdf" Mar 14 08:10:04 crc kubenswrapper[4893]: I0314 08:10:04.209407 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-fdw5v" Mar 14 08:10:04 crc kubenswrapper[4893]: I0314 08:10:04.656651 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-5mtgh"] Mar 14 08:10:04 crc kubenswrapper[4893]: I0314 08:10:04.663587 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-5mtgh"] Mar 14 08:10:05 crc kubenswrapper[4893]: I0314 08:10:05.384789 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6d70d2-3aaf-4301-ab2d-c8adbb0be509" path="/var/lib/kubelet/pods/7b6d70d2-3aaf-4301-ab2d-c8adbb0be509/volumes" Mar 14 08:10:08 crc kubenswrapper[4893]: I0314 08:10:08.812209 4893 scope.go:117] "RemoveContainer" containerID="f36cfc7f44b0bd191cd4564355c44ede7e29936d8509180985064df546cbfc2e" Mar 14 08:11:59 crc kubenswrapper[4893]: I0314 08:11:59.730876 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:11:59 crc kubenswrapper[4893]: I0314 08:11:59.731667 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.145447 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557932-nvlrp"] Mar 14 08:12:00 crc kubenswrapper[4893]: E0314 08:12:00.145785 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793b1f76-6d6b-434b-8d3a-c10346c023d7" containerName="oc" Mar 14 08:12:00 crc 
kubenswrapper[4893]: I0314 08:12:00.145796 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="793b1f76-6d6b-434b-8d3a-c10346c023d7" containerName="oc" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.145976 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="793b1f76-6d6b-434b-8d3a-c10346c023d7" containerName="oc" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.147425 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.150821 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-nvlrp"] Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.192081 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.192122 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.192301 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.192716 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kcl\" (UniqueName: \"kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl\") pod \"auto-csr-approver-29557932-nvlrp\" (UID: \"0efee831-113a-4473-8091-7c1d51cafd5d\") " pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.294696 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kcl\" (UniqueName: \"kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl\") pod \"auto-csr-approver-29557932-nvlrp\" 
(UID: \"0efee831-113a-4473-8091-7c1d51cafd5d\") " pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.312285 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kcl\" (UniqueName: \"kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl\") pod \"auto-csr-approver-29557932-nvlrp\" (UID: \"0efee831-113a-4473-8091-7c1d51cafd5d\") " pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.519574 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:00 crc kubenswrapper[4893]: I0314 08:12:00.920745 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-nvlrp"] Mar 14 08:12:01 crc kubenswrapper[4893]: I0314 08:12:01.230672 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" event={"ID":"0efee831-113a-4473-8091-7c1d51cafd5d","Type":"ContainerStarted","Data":"5830c0434cf75184f74a182501ad56c531b5e205356f2fdd384b94c13d48badd"} Mar 14 08:12:02 crc kubenswrapper[4893]: I0314 08:12:02.241901 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" event={"ID":"0efee831-113a-4473-8091-7c1d51cafd5d","Type":"ContainerStarted","Data":"322bc6f2a2ea9f756e4ff76fe8d8514aeac2cb2dfd13cca27e419df8183cbf01"} Mar 14 08:12:02 crc kubenswrapper[4893]: I0314 08:12:02.260349 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" podStartSLOduration=1.353023032 podStartE2EDuration="2.260319577s" podCreationTimestamp="2026-03-14 08:12:00 +0000 UTC" firstStartedPulling="2026-03-14 08:12:00.930781387 +0000 UTC m=+4400.192958179" lastFinishedPulling="2026-03-14 08:12:01.838077932 +0000 UTC 
m=+4401.100254724" observedRunningTime="2026-03-14 08:12:02.253989444 +0000 UTC m=+4401.516166236" watchObservedRunningTime="2026-03-14 08:12:02.260319577 +0000 UTC m=+4401.522496379" Mar 14 08:12:03 crc kubenswrapper[4893]: I0314 08:12:03.250928 4893 generic.go:334] "Generic (PLEG): container finished" podID="0efee831-113a-4473-8091-7c1d51cafd5d" containerID="322bc6f2a2ea9f756e4ff76fe8d8514aeac2cb2dfd13cca27e419df8183cbf01" exitCode=0 Mar 14 08:12:03 crc kubenswrapper[4893]: I0314 08:12:03.251001 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" event={"ID":"0efee831-113a-4473-8091-7c1d51cafd5d","Type":"ContainerDied","Data":"322bc6f2a2ea9f756e4ff76fe8d8514aeac2cb2dfd13cca27e419df8183cbf01"} Mar 14 08:12:04 crc kubenswrapper[4893]: I0314 08:12:04.510694 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:04 crc kubenswrapper[4893]: I0314 08:12:04.662367 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8kcl\" (UniqueName: \"kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl\") pod \"0efee831-113a-4473-8091-7c1d51cafd5d\" (UID: \"0efee831-113a-4473-8091-7c1d51cafd5d\") " Mar 14 08:12:04 crc kubenswrapper[4893]: I0314 08:12:04.676265 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl" (OuterVolumeSpecName: "kube-api-access-t8kcl") pod "0efee831-113a-4473-8091-7c1d51cafd5d" (UID: "0efee831-113a-4473-8091-7c1d51cafd5d"). InnerVolumeSpecName "kube-api-access-t8kcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:12:04 crc kubenswrapper[4893]: I0314 08:12:04.764190 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8kcl\" (UniqueName: \"kubernetes.io/projected/0efee831-113a-4473-8091-7c1d51cafd5d-kube-api-access-t8kcl\") on node \"crc\" DevicePath \"\"" Mar 14 08:12:05 crc kubenswrapper[4893]: I0314 08:12:05.270748 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" event={"ID":"0efee831-113a-4473-8091-7c1d51cafd5d","Type":"ContainerDied","Data":"5830c0434cf75184f74a182501ad56c531b5e205356f2fdd384b94c13d48badd"} Mar 14 08:12:05 crc kubenswrapper[4893]: I0314 08:12:05.270793 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5830c0434cf75184f74a182501ad56c531b5e205356f2fdd384b94c13d48badd" Mar 14 08:12:05 crc kubenswrapper[4893]: I0314 08:12:05.270860 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-nvlrp" Mar 14 08:12:05 crc kubenswrapper[4893]: I0314 08:12:05.587362 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-sqlws"] Mar 14 08:12:05 crc kubenswrapper[4893]: I0314 08:12:05.592010 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-sqlws"] Mar 14 08:12:07 crc kubenswrapper[4893]: I0314 08:12:07.388164 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b72a40-c832-4b17-8135-1da465713199" path="/var/lib/kubelet/pods/c2b72a40-c832-4b17-8135-1da465713199/volumes" Mar 14 08:12:08 crc kubenswrapper[4893]: I0314 08:12:08.927436 4893 scope.go:117] "RemoveContainer" containerID="68359cc89d3363585d3ce74edb93b87c43115f11664ce3f87616861367265c9a" Mar 14 08:12:29 crc kubenswrapper[4893]: I0314 08:12:29.731071 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:12:29 crc kubenswrapper[4893]: I0314 08:12:29.731762 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:12:59 crc kubenswrapper[4893]: I0314 08:12:59.731723 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:12:59 crc kubenswrapper[4893]: I0314 08:12:59.732367 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:12:59 crc kubenswrapper[4893]: I0314 08:12:59.732432 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:12:59 crc kubenswrapper[4893]: I0314 08:12:59.733301 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 14 08:12:59 crc kubenswrapper[4893]: I0314 08:12:59.733375 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" gracePeriod=600 Mar 14 08:12:59 crc kubenswrapper[4893]: E0314 08:12:59.853166 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:13:00 crc kubenswrapper[4893]: I0314 08:13:00.706569 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" exitCode=0 Mar 14 08:13:00 crc kubenswrapper[4893]: I0314 08:13:00.706611 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"} Mar 14 08:13:00 crc kubenswrapper[4893]: I0314 08:13:00.706646 4893 scope.go:117] "RemoveContainer" containerID="66da3e9f540f9b798012137c734a2fc1fb70c88334a2ed2a3dfc71b99343d32e" Mar 14 08:13:00 crc kubenswrapper[4893]: I0314 08:13:00.707157 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:13:00 crc kubenswrapper[4893]: E0314 08:13:00.707509 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:13:12 crc kubenswrapper[4893]: I0314 08:13:12.377486 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:13:12 crc kubenswrapper[4893]: E0314 08:13:12.378591 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:13:26 crc kubenswrapper[4893]: I0314 08:13:26.377452 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:13:26 crc kubenswrapper[4893]: E0314 08:13:26.378726 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:13:38 crc kubenswrapper[4893]: I0314 08:13:38.376756 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:13:38 crc kubenswrapper[4893]: E0314 08:13:38.377412 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:13:52 crc kubenswrapper[4893]: I0314 08:13:52.376280 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:13:52 crc kubenswrapper[4893]: E0314 08:13:52.377244 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.142655 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557934-qndlc"] Mar 14 08:14:00 crc kubenswrapper[4893]: E0314 08:14:00.145987 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efee831-113a-4473-8091-7c1d51cafd5d" containerName="oc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.146019 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efee831-113a-4473-8091-7c1d51cafd5d" containerName="oc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.146158 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efee831-113a-4473-8091-7c1d51cafd5d" containerName="oc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.146665 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.149013 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.149084 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.149094 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.153625 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-qndlc"] Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.321985 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzl9g\" (UniqueName: \"kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g\") pod \"auto-csr-approver-29557934-qndlc\" (UID: \"a54bc35f-50f6-4e96-80b2-274392a59aaa\") " pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.423685 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzl9g\" (UniqueName: \"kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g\") pod \"auto-csr-approver-29557934-qndlc\" (UID: \"a54bc35f-50f6-4e96-80b2-274392a59aaa\") " pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.440767 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzl9g\" (UniqueName: \"kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g\") pod \"auto-csr-approver-29557934-qndlc\" (UID: \"a54bc35f-50f6-4e96-80b2-274392a59aaa\") " 
pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.468766 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.877692 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-qndlc"] Mar 14 08:14:00 crc kubenswrapper[4893]: I0314 08:14:00.884737 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:14:01 crc kubenswrapper[4893]: I0314 08:14:01.165744 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-qndlc" event={"ID":"a54bc35f-50f6-4e96-80b2-274392a59aaa","Type":"ContainerStarted","Data":"6e7a1bf9ce4562d92fc1dfe5cced7b9b3d61616c60e9bb09a495bcb87384fc64"} Mar 14 08:14:02 crc kubenswrapper[4893]: I0314 08:14:02.172563 4893 generic.go:334] "Generic (PLEG): container finished" podID="a54bc35f-50f6-4e96-80b2-274392a59aaa" containerID="9a963b1a1b431572dcd660db0a253a810dc0c5e4fd402b42f6c4c4bef65ed3b7" exitCode=0 Mar 14 08:14:02 crc kubenswrapper[4893]: I0314 08:14:02.172644 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-qndlc" event={"ID":"a54bc35f-50f6-4e96-80b2-274392a59aaa","Type":"ContainerDied","Data":"9a963b1a1b431572dcd660db0a253a810dc0c5e4fd402b42f6c4c4bef65ed3b7"} Mar 14 08:14:03 crc kubenswrapper[4893]: I0314 08:14:03.550668 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:03 crc kubenswrapper[4893]: I0314 08:14:03.669258 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzl9g\" (UniqueName: \"kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g\") pod \"a54bc35f-50f6-4e96-80b2-274392a59aaa\" (UID: \"a54bc35f-50f6-4e96-80b2-274392a59aaa\") " Mar 14 08:14:03 crc kubenswrapper[4893]: I0314 08:14:03.675105 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g" (OuterVolumeSpecName: "kube-api-access-xzl9g") pod "a54bc35f-50f6-4e96-80b2-274392a59aaa" (UID: "a54bc35f-50f6-4e96-80b2-274392a59aaa"). InnerVolumeSpecName "kube-api-access-xzl9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:14:03 crc kubenswrapper[4893]: I0314 08:14:03.772466 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzl9g\" (UniqueName: \"kubernetes.io/projected/a54bc35f-50f6-4e96-80b2-274392a59aaa-kube-api-access-xzl9g\") on node \"crc\" DevicePath \"\"" Mar 14 08:14:04 crc kubenswrapper[4893]: I0314 08:14:04.193029 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-qndlc" event={"ID":"a54bc35f-50f6-4e96-80b2-274392a59aaa","Type":"ContainerDied","Data":"6e7a1bf9ce4562d92fc1dfe5cced7b9b3d61616c60e9bb09a495bcb87384fc64"} Mar 14 08:14:04 crc kubenswrapper[4893]: I0314 08:14:04.193074 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e7a1bf9ce4562d92fc1dfe5cced7b9b3d61616c60e9bb09a495bcb87384fc64" Mar 14 08:14:04 crc kubenswrapper[4893]: I0314 08:14:04.193114 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-qndlc" Mar 14 08:14:04 crc kubenswrapper[4893]: I0314 08:14:04.645926 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-5mshp"] Mar 14 08:14:04 crc kubenswrapper[4893]: I0314 08:14:04.654001 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-5mshp"] Mar 14 08:14:05 crc kubenswrapper[4893]: I0314 08:14:05.376108 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:14:05 crc kubenswrapper[4893]: E0314 08:14:05.376581 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:14:05 crc kubenswrapper[4893]: I0314 08:14:05.383733 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc1448f-bed8-4d56-a646-6e5ddeb86c60" path="/var/lib/kubelet/pods/9cc1448f-bed8-4d56-a646-6e5ddeb86c60/volumes" Mar 14 08:14:09 crc kubenswrapper[4893]: I0314 08:14:09.358956 4893 scope.go:117] "RemoveContainer" containerID="9a5126e7aeeb46a2f863b048a2bbe743e8e10b127e955b060ab78f44219af3fd" Mar 14 08:14:16 crc kubenswrapper[4893]: I0314 08:14:16.377144 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:14:16 crc kubenswrapper[4893]: E0314 08:14:16.378025 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.690343 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:29 crc kubenswrapper[4893]: E0314 08:14:29.691461 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54bc35f-50f6-4e96-80b2-274392a59aaa" containerName="oc" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.691476 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54bc35f-50f6-4e96-80b2-274392a59aaa" containerName="oc" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.691651 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54bc35f-50f6-4e96-80b2-274392a59aaa" containerName="oc" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.692785 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.700991 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.881898 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-292w7\" (UniqueName: \"kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.881950 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.882223 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.984048 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-292w7\" (UniqueName: \"kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.984091 4893 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.984151 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.984577 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:29 crc kubenswrapper[4893]: I0314 08:14:29.984673 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:30 crc kubenswrapper[4893]: I0314 08:14:30.008585 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-292w7\" (UniqueName: \"kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7\") pod \"community-operators-b9888\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:30 crc kubenswrapper[4893]: I0314 08:14:30.308955 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:30 crc kubenswrapper[4893]: I0314 08:14:30.376757 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:14:30 crc kubenswrapper[4893]: E0314 08:14:30.377022 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:14:30 crc kubenswrapper[4893]: I0314 08:14:30.727894 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:31 crc kubenswrapper[4893]: I0314 08:14:31.399579 4893 generic.go:334] "Generic (PLEG): container finished" podID="71b64914-14e9-4e81-923c-1406bc119fde" containerID="fec8d93260fec95e4acfb7a827c99860933c0cba2db82054e728f44f2d970ca7" exitCode=0 Mar 14 08:14:31 crc kubenswrapper[4893]: I0314 08:14:31.399679 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerDied","Data":"fec8d93260fec95e4acfb7a827c99860933c0cba2db82054e728f44f2d970ca7"} Mar 14 08:14:31 crc kubenswrapper[4893]: I0314 08:14:31.399880 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerStarted","Data":"cc7201cfacba12e02ff7cb360a97b64c5f5ae67eafb512fa26d465bc8a87a382"} Mar 14 08:14:32 crc kubenswrapper[4893]: I0314 08:14:32.409935 4893 generic.go:334] "Generic (PLEG): container finished" podID="71b64914-14e9-4e81-923c-1406bc119fde" 
containerID="7a142c1d5b1e8c47af0dd3c72db1827e0c1ec7f9a93490dd044aad3a9b5e5547" exitCode=0 Mar 14 08:14:32 crc kubenswrapper[4893]: I0314 08:14:32.409980 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerDied","Data":"7a142c1d5b1e8c47af0dd3c72db1827e0c1ec7f9a93490dd044aad3a9b5e5547"} Mar 14 08:14:33 crc kubenswrapper[4893]: I0314 08:14:33.422237 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerStarted","Data":"45b0ff4c6dbaba7707228aeaa5e3ab91f663da6a1c7307145bf722fbb91ce7e8"} Mar 14 08:14:33 crc kubenswrapper[4893]: I0314 08:14:33.456141 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9888" podStartSLOduration=3.088955737 podStartE2EDuration="4.456113719s" podCreationTimestamp="2026-03-14 08:14:29 +0000 UTC" firstStartedPulling="2026-03-14 08:14:31.401161143 +0000 UTC m=+4550.663337935" lastFinishedPulling="2026-03-14 08:14:32.768319125 +0000 UTC m=+4552.030495917" observedRunningTime="2026-03-14 08:14:33.441698798 +0000 UTC m=+4552.703875600" watchObservedRunningTime="2026-03-14 08:14:33.456113719 +0000 UTC m=+4552.718290681" Mar 14 08:14:40 crc kubenswrapper[4893]: I0314 08:14:40.310225 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:40 crc kubenswrapper[4893]: I0314 08:14:40.310924 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:40 crc kubenswrapper[4893]: I0314 08:14:40.361499 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:40 crc kubenswrapper[4893]: I0314 
08:14:40.504946 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:40 crc kubenswrapper[4893]: I0314 08:14:40.596897 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:42 crc kubenswrapper[4893]: I0314 08:14:42.482980 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9888" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="registry-server" containerID="cri-o://45b0ff4c6dbaba7707228aeaa5e3ab91f663da6a1c7307145bf722fbb91ce7e8" gracePeriod=2 Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.376400 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:14:43 crc kubenswrapper[4893]: E0314 08:14:43.376660 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.493635 4893 generic.go:334] "Generic (PLEG): container finished" podID="71b64914-14e9-4e81-923c-1406bc119fde" containerID="45b0ff4c6dbaba7707228aeaa5e3ab91f663da6a1c7307145bf722fbb91ce7e8" exitCode=0 Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.493669 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerDied","Data":"45b0ff4c6dbaba7707228aeaa5e3ab91f663da6a1c7307145bf722fbb91ce7e8"} Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.623974 
4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.788049 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content\") pod \"71b64914-14e9-4e81-923c-1406bc119fde\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.788161 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities\") pod \"71b64914-14e9-4e81-923c-1406bc119fde\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.788186 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-292w7\" (UniqueName: \"kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7\") pod \"71b64914-14e9-4e81-923c-1406bc119fde\" (UID: \"71b64914-14e9-4e81-923c-1406bc119fde\") " Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.790177 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities" (OuterVolumeSpecName: "utilities") pod "71b64914-14e9-4e81-923c-1406bc119fde" (UID: "71b64914-14e9-4e81-923c-1406bc119fde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.796219 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7" (OuterVolumeSpecName: "kube-api-access-292w7") pod "71b64914-14e9-4e81-923c-1406bc119fde" (UID: "71b64914-14e9-4e81-923c-1406bc119fde"). 
InnerVolumeSpecName "kube-api-access-292w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.844842 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71b64914-14e9-4e81-923c-1406bc119fde" (UID: "71b64914-14e9-4e81-923c-1406bc119fde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.889491 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.889558 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-292w7\" (UniqueName: \"kubernetes.io/projected/71b64914-14e9-4e81-923c-1406bc119fde-kube-api-access-292w7\") on node \"crc\" DevicePath \"\"" Mar 14 08:14:43 crc kubenswrapper[4893]: I0314 08:14:43.889574 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b64914-14e9-4e81-923c-1406bc119fde-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.508194 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9888" event={"ID":"71b64914-14e9-4e81-923c-1406bc119fde","Type":"ContainerDied","Data":"cc7201cfacba12e02ff7cb360a97b64c5f5ae67eafb512fa26d465bc8a87a382"} Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.508261 4893 scope.go:117] "RemoveContainer" containerID="45b0ff4c6dbaba7707228aeaa5e3ab91f663da6a1c7307145bf722fbb91ce7e8" Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.508267 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9888" Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.536846 4893 scope.go:117] "RemoveContainer" containerID="7a142c1d5b1e8c47af0dd3c72db1827e0c1ec7f9a93490dd044aad3a9b5e5547" Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.575211 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.580372 4893 scope.go:117] "RemoveContainer" containerID="fec8d93260fec95e4acfb7a827c99860933c0cba2db82054e728f44f2d970ca7" Mar 14 08:14:44 crc kubenswrapper[4893]: I0314 08:14:44.585913 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9888"] Mar 14 08:14:45 crc kubenswrapper[4893]: I0314 08:14:45.385720 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b64914-14e9-4e81-923c-1406bc119fde" path="/var/lib/kubelet/pods/71b64914-14e9-4e81-923c-1406bc119fde/volumes" Mar 14 08:14:56 crc kubenswrapper[4893]: I0314 08:14:56.376687 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:14:56 crc kubenswrapper[4893]: E0314 08:14:56.377484 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.147085 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"] Mar 14 08:15:00 crc kubenswrapper[4893]: E0314 08:15:00.147783 4893 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="extract-content"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.147799 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="extract-content"
Mar 14 08:15:00 crc kubenswrapper[4893]: E0314 08:15:00.147809 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="registry-server"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.147816 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="registry-server"
Mar 14 08:15:00 crc kubenswrapper[4893]: E0314 08:15:00.147837 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="extract-utilities"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.147844 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="extract-utilities"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.147978 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b64914-14e9-4e81-923c-1406bc119fde" containerName="registry-server"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.148481 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.155019 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"]
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.160078 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.160272 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.243696 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.244089 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.244118 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.345636 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.345702 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.345765 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.348598 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.352384 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.367330 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm\") pod \"collect-profiles-29557935-lfwdf\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.479575 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:00 crc kubenswrapper[4893]: I0314 08:15:00.907004 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"]
Mar 14 08:15:01 crc kubenswrapper[4893]: I0314 08:15:01.640631 4893 generic.go:334] "Generic (PLEG): container finished" podID="36d14a3d-dda7-4da3-be24-c64c6c120c12" containerID="e4c32854d49fb45a0a62c1291ca8f36766f0fc976ee0e7e05ca651bad761c896" exitCode=0
Mar 14 08:15:01 crc kubenswrapper[4893]: I0314 08:15:01.640766 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf" event={"ID":"36d14a3d-dda7-4da3-be24-c64c6c120c12","Type":"ContainerDied","Data":"e4c32854d49fb45a0a62c1291ca8f36766f0fc976ee0e7e05ca651bad761c896"}
Mar 14 08:15:01 crc kubenswrapper[4893]: I0314 08:15:01.641059 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf" event={"ID":"36d14a3d-dda7-4da3-be24-c64c6c120c12","Type":"ContainerStarted","Data":"8397a4a30d3b95c857e56dfe22dc589f6f62b97e119c0a0fb00a5ff5ebfaadb6"}
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.909001 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.980296 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume\") pod \"36d14a3d-dda7-4da3-be24-c64c6c120c12\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") "
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.980444 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm\") pod \"36d14a3d-dda7-4da3-be24-c64c6c120c12\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") "
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.980488 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume\") pod \"36d14a3d-dda7-4da3-be24-c64c6c120c12\" (UID: \"36d14a3d-dda7-4da3-be24-c64c6c120c12\") "
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.981929 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume" (OuterVolumeSpecName: "config-volume") pod "36d14a3d-dda7-4da3-be24-c64c6c120c12" (UID: "36d14a3d-dda7-4da3-be24-c64c6c120c12"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.985745 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36d14a3d-dda7-4da3-be24-c64c6c120c12" (UID: "36d14a3d-dda7-4da3-be24-c64c6c120c12"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:15:02 crc kubenswrapper[4893]: I0314 08:15:02.985853 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm" (OuterVolumeSpecName: "kube-api-access-47jtm") pod "36d14a3d-dda7-4da3-be24-c64c6c120c12" (UID: "36d14a3d-dda7-4da3-be24-c64c6c120c12"). InnerVolumeSpecName "kube-api-access-47jtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.082592 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/36d14a3d-dda7-4da3-be24-c64c6c120c12-kube-api-access-47jtm\") on node \"crc\" DevicePath \"\""
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.082636 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36d14a3d-dda7-4da3-be24-c64c6c120c12-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.082647 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36d14a3d-dda7-4da3-be24-c64c6c120c12-config-volume\") on node \"crc\" DevicePath \"\""
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.655219 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf" event={"ID":"36d14a3d-dda7-4da3-be24-c64c6c120c12","Type":"ContainerDied","Data":"8397a4a30d3b95c857e56dfe22dc589f6f62b97e119c0a0fb00a5ff5ebfaadb6"}
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.655268 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-lfwdf"
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.655275 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8397a4a30d3b95c857e56dfe22dc589f6f62b97e119c0a0fb00a5ff5ebfaadb6"
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.973109 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5"]
Mar 14 08:15:03 crc kubenswrapper[4893]: I0314 08:15:03.977545 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-wrxz5"]
Mar 14 08:15:05 crc kubenswrapper[4893]: I0314 08:15:05.386701 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d266d4cc-1e1f-43b7-8d8b-98e032249192" path="/var/lib/kubelet/pods/d266d4cc-1e1f-43b7-8d8b-98e032249192/volumes"
Mar 14 08:15:09 crc kubenswrapper[4893]: I0314 08:15:09.459228 4893 scope.go:117] "RemoveContainer" containerID="baa987483ca644dd873671273f1b8070a8cc32a295ba7d47cff6ea4d21605b8b"
Mar 14 08:15:10 crc kubenswrapper[4893]: I0314 08:15:10.376963 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:15:10 crc kubenswrapper[4893]: E0314 08:15:10.377470 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:15:22 crc kubenswrapper[4893]: I0314 08:15:22.376849 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:15:22 crc kubenswrapper[4893]: E0314 08:15:22.377684 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:15:33 crc kubenswrapper[4893]: I0314 08:15:33.377805 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:15:33 crc kubenswrapper[4893]: E0314 08:15:33.379441 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:15:46 crc kubenswrapper[4893]: I0314 08:15:46.377004 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:15:46 crc kubenswrapper[4893]: E0314 08:15:46.377755 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:15:58 crc kubenswrapper[4893]: I0314 08:15:58.376306 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:15:58 crc kubenswrapper[4893]: E0314 08:15:58.378635 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.137773 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557936-tr766"]
Mar 14 08:16:00 crc kubenswrapper[4893]: E0314 08:16:00.139111 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d14a3d-dda7-4da3-be24-c64c6c120c12" containerName="collect-profiles"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.139214 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d14a3d-dda7-4da3-be24-c64c6c120c12" containerName="collect-profiles"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.139437 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d14a3d-dda7-4da3-be24-c64c6c120c12" containerName="collect-profiles"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.140140 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.144391 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.144701 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.147256 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.148134 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-tr766"]
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.257353 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blm89\" (UniqueName: \"kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89\") pod \"auto-csr-approver-29557936-tr766\" (UID: \"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c\") " pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.359150 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blm89\" (UniqueName: \"kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89\") pod \"auto-csr-approver-29557936-tr766\" (UID: \"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c\") " pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.377281 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blm89\" (UniqueName: \"kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89\") pod \"auto-csr-approver-29557936-tr766\" (UID: \"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c\") " pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.463421 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:00 crc kubenswrapper[4893]: I0314 08:16:00.877105 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-tr766"]
Mar 14 08:16:01 crc kubenswrapper[4893]: I0314 08:16:01.095697 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-tr766" event={"ID":"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c","Type":"ContainerStarted","Data":"57be90f8bde8bc32675181381c7bda047c0a6153b5a17cab224bddd3b92e75f9"}
Mar 14 08:16:02 crc kubenswrapper[4893]: I0314 08:16:02.116590 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-tr766" event={"ID":"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c","Type":"ContainerStarted","Data":"1cb08c3199e521b7f07a08c97e8f12b0661b94b10ca646200ce9b1beb6dae5d1"}
Mar 14 08:16:02 crc kubenswrapper[4893]: I0314 08:16:02.133193 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557936-tr766" podStartSLOduration=1.362187197 podStartE2EDuration="2.133176893s" podCreationTimestamp="2026-03-14 08:16:00 +0000 UTC" firstStartedPulling="2026-03-14 08:16:00.893617805 +0000 UTC m=+4640.155794597" lastFinishedPulling="2026-03-14 08:16:01.664607461 +0000 UTC m=+4640.926784293" observedRunningTime="2026-03-14 08:16:02.131931602 +0000 UTC m=+4641.394108414" watchObservedRunningTime="2026-03-14 08:16:02.133176893 +0000 UTC m=+4641.395353685"
Mar 14 08:16:03 crc kubenswrapper[4893]: I0314 08:16:03.127144 4893 generic.go:334] "Generic (PLEG): container finished" podID="fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" containerID="1cb08c3199e521b7f07a08c97e8f12b0661b94b10ca646200ce9b1beb6dae5d1" exitCode=0
Mar 14 08:16:03 crc kubenswrapper[4893]: I0314 08:16:03.127466 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-tr766" event={"ID":"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c","Type":"ContainerDied","Data":"1cb08c3199e521b7f07a08c97e8f12b0661b94b10ca646200ce9b1beb6dae5d1"}
Mar 14 08:16:04 crc kubenswrapper[4893]: I0314 08:16:04.446276 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:04 crc kubenswrapper[4893]: I0314 08:16:04.618013 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blm89\" (UniqueName: \"kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89\") pod \"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c\" (UID: \"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c\") "
Mar 14 08:16:04 crc kubenswrapper[4893]: I0314 08:16:04.625897 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89" (OuterVolumeSpecName: "kube-api-access-blm89") pod "fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" (UID: "fe5f27f2-590d-4349-ad1d-f5ddf73bd55c"). InnerVolumeSpecName "kube-api-access-blm89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:16:04 crc kubenswrapper[4893]: I0314 08:16:04.719746 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blm89\" (UniqueName: \"kubernetes.io/projected/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c-kube-api-access-blm89\") on node \"crc\" DevicePath \"\""
Mar 14 08:16:05 crc kubenswrapper[4893]: I0314 08:16:05.145967 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-tr766" event={"ID":"fe5f27f2-590d-4349-ad1d-f5ddf73bd55c","Type":"ContainerDied","Data":"57be90f8bde8bc32675181381c7bda047c0a6153b5a17cab224bddd3b92e75f9"}
Mar 14 08:16:05 crc kubenswrapper[4893]: I0314 08:16:05.146008 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57be90f8bde8bc32675181381c7bda047c0a6153b5a17cab224bddd3b92e75f9"
Mar 14 08:16:05 crc kubenswrapper[4893]: I0314 08:16:05.146135 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-tr766"
Mar 14 08:16:05 crc kubenswrapper[4893]: I0314 08:16:05.506265 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-fdw5v"]
Mar 14 08:16:05 crc kubenswrapper[4893]: I0314 08:16:05.511218 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-fdw5v"]
Mar 14 08:16:07 crc kubenswrapper[4893]: I0314 08:16:07.390646 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793b1f76-6d6b-434b-8d3a-c10346c023d7" path="/var/lib/kubelet/pods/793b1f76-6d6b-434b-8d3a-c10346c023d7/volumes"
Mar 14 08:16:09 crc kubenswrapper[4893]: I0314 08:16:09.525798 4893 scope.go:117] "RemoveContainer" containerID="ceeecf3d73c3a5b59f1cfbc530df48a14116a849ebc5aab6f2b97b53e15f8ce6"
Mar 14 08:16:13 crc kubenswrapper[4893]: I0314 08:16:13.376666 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:16:13 crc kubenswrapper[4893]: E0314 08:16:13.377196 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:16:25 crc kubenswrapper[4893]: I0314 08:16:25.376670 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:16:25 crc kubenswrapper[4893]: E0314 08:16:25.377419 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:16:39 crc kubenswrapper[4893]: I0314 08:16:39.377470 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:16:39 crc kubenswrapper[4893]: E0314 08:16:39.378462 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:16:50 crc kubenswrapper[4893]: I0314 08:16:50.377192 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:16:50 crc kubenswrapper[4893]: E0314 08:16:50.378015 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:17:01 crc kubenswrapper[4893]: I0314 08:17:01.381193 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:17:01 crc kubenswrapper[4893]: E0314 08:17:01.381982 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:17:12 crc kubenswrapper[4893]: I0314 08:17:12.377599 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:17:12 crc kubenswrapper[4893]: E0314 08:17:12.378332 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:17:27 crc kubenswrapper[4893]: I0314 08:17:27.376228 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:17:27 crc kubenswrapper[4893]: E0314 08:17:27.376834 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:17:39 crc kubenswrapper[4893]: I0314 08:17:39.376589 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:17:39 crc kubenswrapper[4893]: E0314 08:17:39.377411 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:17:52 crc kubenswrapper[4893]: I0314 08:17:52.376715 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:17:52 crc kubenswrapper[4893]: E0314 08:17:52.377596 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.138796 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557938-mpczd"]
Mar 14 08:18:00 crc kubenswrapper[4893]: E0314 08:18:00.139618 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" containerName="oc"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.139633 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" containerName="oc"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.139814 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" containerName="oc"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.140249 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.142673 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.143471 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.143655 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.150870 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-mpczd"]
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.281282 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk67q\" (UniqueName: \"kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q\") pod \"auto-csr-approver-29557938-mpczd\" (UID: \"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1\") " pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.381984 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk67q\" (UniqueName: \"kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q\") pod \"auto-csr-approver-29557938-mpczd\" (UID: \"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1\") " pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.399228 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk67q\" (UniqueName: \"kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q\") pod \"auto-csr-approver-29557938-mpczd\" (UID: \"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1\") " pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.457786 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:00 crc kubenswrapper[4893]: I0314 08:18:00.853655 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-mpczd"]
Mar 14 08:18:00 crc kubenswrapper[4893]: W0314 08:18:00.861397 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae93951_cc2f_49ab_a84b_3a2fdfb5d2a1.slice/crio-f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8 WatchSource:0}: Error finding container f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8: Status 404 returned error can't find the container with id f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8
Mar 14 08:18:01 crc kubenswrapper[4893]: I0314 08:18:01.005122 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-mpczd" event={"ID":"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1","Type":"ContainerStarted","Data":"f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8"}
Mar 14 08:18:02 crc kubenswrapper[4893]: I0314 08:18:02.015005 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-mpczd" event={"ID":"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1","Type":"ContainerStarted","Data":"d9669102bdcc6575f93b9383cefc26fdeeadc8ba9b52001ae3e1fb75317c1a41"}
Mar 14 08:18:02 crc kubenswrapper[4893]: I0314 08:18:02.034690 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557938-mpczd" podStartSLOduration=1.084371973 podStartE2EDuration="2.034663987s" podCreationTimestamp="2026-03-14 08:18:00 +0000 UTC" firstStartedPulling="2026-03-14 08:18:00.8643069 +0000 UTC m=+4760.126483692" lastFinishedPulling="2026-03-14 08:18:01.814598894 +0000 UTC m=+4761.076775706" observedRunningTime="2026-03-14 08:18:02.027461521 +0000 UTC m=+4761.289638313" watchObservedRunningTime="2026-03-14 08:18:02.034663987 +0000 UTC m=+4761.296840779"
Mar 14 08:18:03 crc kubenswrapper[4893]: I0314 08:18:03.023149 4893 generic.go:334] "Generic (PLEG): container finished" podID="aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" containerID="d9669102bdcc6575f93b9383cefc26fdeeadc8ba9b52001ae3e1fb75317c1a41" exitCode=0
Mar 14 08:18:03 crc kubenswrapper[4893]: I0314 08:18:03.023189 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-mpczd" event={"ID":"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1","Type":"ContainerDied","Data":"d9669102bdcc6575f93b9383cefc26fdeeadc8ba9b52001ae3e1fb75317c1a41"}
Mar 14 08:18:04 crc kubenswrapper[4893]: I0314 08:18:04.709352 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:04 crc kubenswrapper[4893]: I0314 08:18:04.851510 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk67q\" (UniqueName: \"kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q\") pod \"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1\" (UID: \"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1\") "
Mar 14 08:18:04 crc kubenswrapper[4893]: I0314 08:18:04.857223 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q" (OuterVolumeSpecName: "kube-api-access-pk67q") pod "aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" (UID: "aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1"). InnerVolumeSpecName "kube-api-access-pk67q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:18:04 crc kubenswrapper[4893]: I0314 08:18:04.952839 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk67q\" (UniqueName: \"kubernetes.io/projected/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1-kube-api-access-pk67q\") on node \"crc\" DevicePath \"\""
Mar 14 08:18:05 crc kubenswrapper[4893]: I0314 08:18:05.049542 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-mpczd" event={"ID":"aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1","Type":"ContainerDied","Data":"f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8"}
Mar 14 08:18:05 crc kubenswrapper[4893]: I0314 08:18:05.049588 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e328e0e49c7e0df5f7acf42d7c75f16727b8037d64a60e6848f1d2b6d62cb8"
Mar 14 08:18:05 crc kubenswrapper[4893]: I0314 08:18:05.049632 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-mpczd"
Mar 14 08:18:05 crc kubenswrapper[4893]: I0314 08:18:05.781296 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-nvlrp"]
Mar 14 08:18:05 crc kubenswrapper[4893]: I0314 08:18:05.788315 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-nvlrp"]
Mar 14 08:18:07 crc kubenswrapper[4893]: I0314 08:18:07.376699 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900"
Mar 14 08:18:07 crc kubenswrapper[4893]: I0314 08:18:07.387774 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0efee831-113a-4473-8091-7c1d51cafd5d" path="/var/lib/kubelet/pods/0efee831-113a-4473-8091-7c1d51cafd5d/volumes"
Mar 14 08:18:08 crc kubenswrapper[4893]: I0314 08:18:08.077046 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710"}
Mar 14 08:18:09 crc kubenswrapper[4893]: I0314 08:18:09.602739 4893 scope.go:117] "RemoveContainer" containerID="322bc6f2a2ea9f756e4ff76fe8d8514aeac2cb2dfd13cca27e419df8183cbf01"
Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.725631 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"]
Mar 14 08:18:25 crc kubenswrapper[4893]: E0314 08:18:25.726424 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" containerName="oc"
Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.726437 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" containerName="oc"
Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.726605 4893
memory_manager.go:354] "RemoveStaleState removing state" podUID="aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" containerName="oc" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.727619 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.741318 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"] Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.867126 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bvv\" (UniqueName: \"kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.867222 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.867356 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.968874 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content\") pod 
\"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.968930 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.968978 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bvv\" (UniqueName: \"kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.969497 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.969601 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content\") pod \"redhat-operators-ltjjt\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:25 crc kubenswrapper[4893]: I0314 08:18:25.991438 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bvv\" (UniqueName: \"kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv\") pod \"redhat-operators-ltjjt\" (UID: 
\"89bda156-cd95-4a54-8911-d9eb964535a8\") " pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:26 crc kubenswrapper[4893]: I0314 08:18:26.050629 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:26 crc kubenswrapper[4893]: I0314 08:18:26.774907 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"] Mar 14 08:18:26 crc kubenswrapper[4893]: W0314 08:18:26.777091 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89bda156_cd95_4a54_8911_d9eb964535a8.slice/crio-d03403f89f0aacfee8d2ed13f8d15cfb0e64317d5128adbf26df8368f39d48cb WatchSource:0}: Error finding container d03403f89f0aacfee8d2ed13f8d15cfb0e64317d5128adbf26df8368f39d48cb: Status 404 returned error can't find the container with id d03403f89f0aacfee8d2ed13f8d15cfb0e64317d5128adbf26df8368f39d48cb Mar 14 08:18:27 crc kubenswrapper[4893]: I0314 08:18:27.275349 4893 generic.go:334] "Generic (PLEG): container finished" podID="89bda156-cd95-4a54-8911-d9eb964535a8" containerID="e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f" exitCode=0 Mar 14 08:18:27 crc kubenswrapper[4893]: I0314 08:18:27.275444 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerDied","Data":"e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f"} Mar 14 08:18:27 crc kubenswrapper[4893]: I0314 08:18:27.276151 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerStarted","Data":"d03403f89f0aacfee8d2ed13f8d15cfb0e64317d5128adbf26df8368f39d48cb"} Mar 14 08:18:28 crc kubenswrapper[4893]: I0314 08:18:28.289619 4893 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerStarted","Data":"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8"} Mar 14 08:18:29 crc kubenswrapper[4893]: I0314 08:18:29.302165 4893 generic.go:334] "Generic (PLEG): container finished" podID="89bda156-cd95-4a54-8911-d9eb964535a8" containerID="9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8" exitCode=0 Mar 14 08:18:29 crc kubenswrapper[4893]: I0314 08:18:29.302219 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerDied","Data":"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8"} Mar 14 08:18:29 crc kubenswrapper[4893]: I0314 08:18:29.902052 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:29 crc kubenswrapper[4893]: I0314 08:18:29.904089 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:29 crc kubenswrapper[4893]: I0314 08:18:29.940381 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.032864 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.032931 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfnrv\" (UniqueName: \"kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.032971 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.134389 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.134469 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zfnrv\" (UniqueName: \"kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.134589 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.135221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.135935 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.156993 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfnrv\" (UniqueName: \"kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv\") pod \"certified-operators-8f6tq\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.221284 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.311537 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerStarted","Data":"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e"} Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.336374 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ltjjt" podStartSLOduration=2.916488471 podStartE2EDuration="5.336349849s" podCreationTimestamp="2026-03-14 08:18:25 +0000 UTC" firstStartedPulling="2026-03-14 08:18:27.277019387 +0000 UTC m=+4786.539196179" lastFinishedPulling="2026-03-14 08:18:29.696880765 +0000 UTC m=+4788.959057557" observedRunningTime="2026-03-14 08:18:30.329940203 +0000 UTC m=+4789.592117015" watchObservedRunningTime="2026-03-14 08:18:30.336349849 +0000 UTC m=+4789.598526641" Mar 14 08:18:30 crc kubenswrapper[4893]: I0314 08:18:30.694192 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:31 crc kubenswrapper[4893]: I0314 08:18:31.320575 4893 generic.go:334] "Generic (PLEG): container finished" podID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerID="ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7" exitCode=0 Mar 14 08:18:31 crc kubenswrapper[4893]: I0314 08:18:31.320775 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerDied","Data":"ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7"} Mar 14 08:18:31 crc kubenswrapper[4893]: I0314 08:18:31.320931 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" 
event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerStarted","Data":"2e10acd9e133bfa7db9fbd86ac60ec85dfb5a1bc72939f31236eb0aef58b9d7b"} Mar 14 08:18:32 crc kubenswrapper[4893]: I0314 08:18:32.328155 4893 generic.go:334] "Generic (PLEG): container finished" podID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerID="fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84" exitCode=0 Mar 14 08:18:32 crc kubenswrapper[4893]: I0314 08:18:32.328194 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerDied","Data":"fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84"} Mar 14 08:18:33 crc kubenswrapper[4893]: I0314 08:18:33.336902 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerStarted","Data":"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae"} Mar 14 08:18:33 crc kubenswrapper[4893]: I0314 08:18:33.356457 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8f6tq" podStartSLOduration=2.900075702 podStartE2EDuration="4.356437463s" podCreationTimestamp="2026-03-14 08:18:29 +0000 UTC" firstStartedPulling="2026-03-14 08:18:31.322560381 +0000 UTC m=+4790.584737173" lastFinishedPulling="2026-03-14 08:18:32.778922142 +0000 UTC m=+4792.041098934" observedRunningTime="2026-03-14 08:18:33.355838269 +0000 UTC m=+4792.618015081" watchObservedRunningTime="2026-03-14 08:18:33.356437463 +0000 UTC m=+4792.618614255" Mar 14 08:18:36 crc kubenswrapper[4893]: I0314 08:18:36.052196 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:36 crc kubenswrapper[4893]: I0314 08:18:36.052535 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:36 crc kubenswrapper[4893]: I0314 08:18:36.094424 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:36 crc kubenswrapper[4893]: I0314 08:18:36.401472 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:37 crc kubenswrapper[4893]: I0314 08:18:37.095026 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"] Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.372294 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ltjjt" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="registry-server" containerID="cri-o://7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e" gracePeriod=2 Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.809107 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.960340 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities\") pod \"89bda156-cd95-4a54-8911-d9eb964535a8\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.960445 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5bvv\" (UniqueName: \"kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv\") pod \"89bda156-cd95-4a54-8911-d9eb964535a8\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.960671 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content\") pod \"89bda156-cd95-4a54-8911-d9eb964535a8\" (UID: \"89bda156-cd95-4a54-8911-d9eb964535a8\") " Mar 14 08:18:38 crc kubenswrapper[4893]: I0314 08:18:38.961461 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities" (OuterVolumeSpecName: "utilities") pod "89bda156-cd95-4a54-8911-d9eb964535a8" (UID: "89bda156-cd95-4a54-8911-d9eb964535a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.062489 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.390660 4893 generic.go:334] "Generic (PLEG): container finished" podID="89bda156-cd95-4a54-8911-d9eb964535a8" containerID="7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e" exitCode=0 Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.390815 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltjjt" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.394910 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerDied","Data":"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e"} Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.394953 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltjjt" event={"ID":"89bda156-cd95-4a54-8911-d9eb964535a8","Type":"ContainerDied","Data":"d03403f89f0aacfee8d2ed13f8d15cfb0e64317d5128adbf26df8368f39d48cb"} Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.394972 4893 scope.go:117] "RemoveContainer" containerID="7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.419638 4893 scope.go:117] "RemoveContainer" containerID="9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.458220 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv" 
(OuterVolumeSpecName: "kube-api-access-c5bvv") pod "89bda156-cd95-4a54-8911-d9eb964535a8" (UID: "89bda156-cd95-4a54-8911-d9eb964535a8"). InnerVolumeSpecName "kube-api-access-c5bvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.470568 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5bvv\" (UniqueName: \"kubernetes.io/projected/89bda156-cd95-4a54-8911-d9eb964535a8-kube-api-access-c5bvv\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.471171 4893 scope.go:117] "RemoveContainer" containerID="e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.611878 4893 scope.go:117] "RemoveContainer" containerID="7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e" Mar 14 08:18:39 crc kubenswrapper[4893]: E0314 08:18:39.617949 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e\": container with ID starting with 7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e not found: ID does not exist" containerID="7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.617998 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e"} err="failed to get container status \"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e\": rpc error: code = NotFound desc = could not find container \"7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e\": container with ID starting with 7fadf6dc11fd7124407918b981fbaf34e13ce80420a867ec3dee6db4660d189e not found: ID does not exist" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.618033 
4893 scope.go:117] "RemoveContainer" containerID="9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8" Mar 14 08:18:39 crc kubenswrapper[4893]: E0314 08:18:39.618786 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8\": container with ID starting with 9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8 not found: ID does not exist" containerID="9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.618823 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8"} err="failed to get container status \"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8\": rpc error: code = NotFound desc = could not find container \"9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8\": container with ID starting with 9d230a1a673374286998f519f9c90329bae954cf9ccee628760f92ba1ae66eb8 not found: ID does not exist" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.618845 4893 scope.go:117] "RemoveContainer" containerID="e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f" Mar 14 08:18:39 crc kubenswrapper[4893]: E0314 08:18:39.619340 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f\": container with ID starting with e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f not found: ID does not exist" containerID="e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f" Mar 14 08:18:39 crc kubenswrapper[4893]: I0314 08:18:39.619402 4893 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f"} err="failed to get container status \"e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f\": rpc error: code = NotFound desc = could not find container \"e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f\": container with ID starting with e93bdce13d69fd0b1fbbc5be3db6194c77c2e0210e6acd546fd3dc1f4c317f8f not found: ID does not exist" Mar 14 08:18:40 crc kubenswrapper[4893]: I0314 08:18:40.222509 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:40 crc kubenswrapper[4893]: I0314 08:18:40.223837 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:40 crc kubenswrapper[4893]: I0314 08:18:40.265609 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:40 crc kubenswrapper[4893]: I0314 08:18:40.455206 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:41 crc kubenswrapper[4893]: I0314 08:18:41.495037 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:41 crc kubenswrapper[4893]: I0314 08:18:41.867394 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89bda156-cd95-4a54-8911-d9eb964535a8" (UID: "89bda156-cd95-4a54-8911-d9eb964535a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:18:41 crc kubenswrapper[4893]: I0314 08:18:41.940202 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bda156-cd95-4a54-8911-d9eb964535a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:42 crc kubenswrapper[4893]: I0314 08:18:42.132043 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"] Mar 14 08:18:42 crc kubenswrapper[4893]: I0314 08:18:42.139467 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ltjjt"] Mar 14 08:18:42 crc kubenswrapper[4893]: I0314 08:18:42.410637 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8f6tq" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="registry-server" containerID="cri-o://939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae" gracePeriod=2 Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.326173 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.376624 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content\") pod \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.376738 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfnrv\" (UniqueName: \"kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv\") pod \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.377193 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities\") pod \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\" (UID: \"327bd0f6-3603-4ffb-976a-0ef87d92e6c1\") " Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.377923 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities" (OuterVolumeSpecName: "utilities") pod "327bd0f6-3603-4ffb-976a-0ef87d92e6c1" (UID: "327bd0f6-3603-4ffb-976a-0ef87d92e6c1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.386058 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" path="/var/lib/kubelet/pods/89bda156-cd95-4a54-8911-d9eb964535a8/volumes" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.388738 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv" (OuterVolumeSpecName: "kube-api-access-zfnrv") pod "327bd0f6-3603-4ffb-976a-0ef87d92e6c1" (UID: "327bd0f6-3603-4ffb-976a-0ef87d92e6c1"). InnerVolumeSpecName "kube-api-access-zfnrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.420306 4893 generic.go:334] "Generic (PLEG): container finished" podID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerID="939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae" exitCode=0 Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.420346 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerDied","Data":"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae"} Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.420359 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f6tq" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.420369 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f6tq" event={"ID":"327bd0f6-3603-4ffb-976a-0ef87d92e6c1","Type":"ContainerDied","Data":"2e10acd9e133bfa7db9fbd86ac60ec85dfb5a1bc72939f31236eb0aef58b9d7b"} Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.420385 4893 scope.go:117] "RemoveContainer" containerID="939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.434551 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "327bd0f6-3603-4ffb-976a-0ef87d92e6c1" (UID: "327bd0f6-3603-4ffb-976a-0ef87d92e6c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.443717 4893 scope.go:117] "RemoveContainer" containerID="fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.459994 4893 scope.go:117] "RemoveContainer" containerID="ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.479597 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.479643 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfnrv\" (UniqueName: \"kubernetes.io/projected/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-kube-api-access-zfnrv\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.479662 
4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bd0f6-3603-4ffb-976a-0ef87d92e6c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.492397 4893 scope.go:117] "RemoveContainer" containerID="939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae" Mar 14 08:18:43 crc kubenswrapper[4893]: E0314 08:18:43.492908 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae\": container with ID starting with 939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae not found: ID does not exist" containerID="939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.492954 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae"} err="failed to get container status \"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae\": rpc error: code = NotFound desc = could not find container \"939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae\": container with ID starting with 939c1ec1e2235d14eed2903258050d8162200b3e750c2e1d37dd1a003f1f4eae not found: ID does not exist" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.492985 4893 scope.go:117] "RemoveContainer" containerID="fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84" Mar 14 08:18:43 crc kubenswrapper[4893]: E0314 08:18:43.493599 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84\": container with ID starting with fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84 not found: ID does not exist" 
containerID="fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.493637 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84"} err="failed to get container status \"fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84\": rpc error: code = NotFound desc = could not find container \"fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84\": container with ID starting with fefea47699c80e2dd1faefe2f311556172a8cbe0ad7cda35bcce41877185ce84 not found: ID does not exist" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.493663 4893 scope.go:117] "RemoveContainer" containerID="ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7" Mar 14 08:18:43 crc kubenswrapper[4893]: E0314 08:18:43.493956 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7\": container with ID starting with ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7 not found: ID does not exist" containerID="ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.493991 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7"} err="failed to get container status \"ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7\": rpc error: code = NotFound desc = could not find container \"ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7\": container with ID starting with ffc5dfbd3fc295e061da23e495fa4d83baf396fb834c592e5b402335068c27a7 not found: ID does not exist" Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.765503 4893 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:43 crc kubenswrapper[4893]: I0314 08:18:43.769884 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8f6tq"] Mar 14 08:18:45 crc kubenswrapper[4893]: I0314 08:18:45.387040 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" path="/var/lib/kubelet/pods/327bd0f6-3603-4ffb-976a-0ef87d92e6c1/volumes" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.140404 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557940-hzvhr"] Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141186 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="extract-utilities" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141197 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="extract-utilities" Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141222 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="extract-content" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141230 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="extract-content" Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141245 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="extract-content" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141254 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="extract-content" Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141270 4893 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141277 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141356 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141366 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: E0314 08:20:00.141381 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="extract-utilities" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141388 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="extract-utilities" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141754 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="327bd0f6-3603-4ffb-976a-0ef87d92e6c1" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.141773 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="89bda156-cd95-4a54-8911-d9eb964535a8" containerName="registry-server" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.142185 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.144003 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.144241 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.144428 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.147286 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-hzvhr"] Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.249147 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf8f2\" (UniqueName: \"kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2\") pod \"auto-csr-approver-29557940-hzvhr\" (UID: \"86f97d66-0b8d-4ba5-9792-0dd9b9a21472\") " pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.350845 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf8f2\" (UniqueName: \"kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2\") pod \"auto-csr-approver-29557940-hzvhr\" (UID: \"86f97d66-0b8d-4ba5-9792-0dd9b9a21472\") " pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.369066 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf8f2\" (UniqueName: \"kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2\") pod \"auto-csr-approver-29557940-hzvhr\" (UID: \"86f97d66-0b8d-4ba5-9792-0dd9b9a21472\") " 
pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.462005 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.879941 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-hzvhr"] Mar 14 08:20:00 crc kubenswrapper[4893]: I0314 08:20:00.888862 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:20:01 crc kubenswrapper[4893]: I0314 08:20:01.033795 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" event={"ID":"86f97d66-0b8d-4ba5-9792-0dd9b9a21472","Type":"ContainerStarted","Data":"9ef96318b425d492818fec2c6b32c93bdc3b998e89fb9b11913364d16e092230"} Mar 14 08:20:03 crc kubenswrapper[4893]: I0314 08:20:03.053735 4893 generic.go:334] "Generic (PLEG): container finished" podID="86f97d66-0b8d-4ba5-9792-0dd9b9a21472" containerID="b262c09ce542900ef4f708769b4f8e681a36fd9063735c1b7789726c0ec2d8ab" exitCode=0 Mar 14 08:20:03 crc kubenswrapper[4893]: I0314 08:20:03.053817 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" event={"ID":"86f97d66-0b8d-4ba5-9792-0dd9b9a21472","Type":"ContainerDied","Data":"b262c09ce542900ef4f708769b4f8e681a36fd9063735c1b7789726c0ec2d8ab"} Mar 14 08:20:04 crc kubenswrapper[4893]: I0314 08:20:04.314096 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:04 crc kubenswrapper[4893]: I0314 08:20:04.515391 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf8f2\" (UniqueName: \"kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2\") pod \"86f97d66-0b8d-4ba5-9792-0dd9b9a21472\" (UID: \"86f97d66-0b8d-4ba5-9792-0dd9b9a21472\") " Mar 14 08:20:04 crc kubenswrapper[4893]: I0314 08:20:04.521485 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2" (OuterVolumeSpecName: "kube-api-access-wf8f2") pod "86f97d66-0b8d-4ba5-9792-0dd9b9a21472" (UID: "86f97d66-0b8d-4ba5-9792-0dd9b9a21472"). InnerVolumeSpecName "kube-api-access-wf8f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:20:04 crc kubenswrapper[4893]: I0314 08:20:04.617944 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf8f2\" (UniqueName: \"kubernetes.io/projected/86f97d66-0b8d-4ba5-9792-0dd9b9a21472-kube-api-access-wf8f2\") on node \"crc\" DevicePath \"\"" Mar 14 08:20:05 crc kubenswrapper[4893]: I0314 08:20:05.073591 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" event={"ID":"86f97d66-0b8d-4ba5-9792-0dd9b9a21472","Type":"ContainerDied","Data":"9ef96318b425d492818fec2c6b32c93bdc3b998e89fb9b11913364d16e092230"} Mar 14 08:20:05 crc kubenswrapper[4893]: I0314 08:20:05.073639 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ef96318b425d492818fec2c6b32c93bdc3b998e89fb9b11913364d16e092230" Mar 14 08:20:05 crc kubenswrapper[4893]: I0314 08:20:05.073685 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-hzvhr" Mar 14 08:20:05 crc kubenswrapper[4893]: I0314 08:20:05.374436 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-qndlc"] Mar 14 08:20:05 crc kubenswrapper[4893]: I0314 08:20:05.384710 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-qndlc"] Mar 14 08:20:07 crc kubenswrapper[4893]: I0314 08:20:07.386380 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54bc35f-50f6-4e96-80b2-274392a59aaa" path="/var/lib/kubelet/pods/a54bc35f-50f6-4e96-80b2-274392a59aaa/volumes" Mar 14 08:20:09 crc kubenswrapper[4893]: I0314 08:20:09.701154 4893 scope.go:117] "RemoveContainer" containerID="9a963b1a1b431572dcd660db0a253a810dc0c5e4fd402b42f6c4c4bef65ed3b7" Mar 14 08:20:29 crc kubenswrapper[4893]: I0314 08:20:29.730675 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:20:29 crc kubenswrapper[4893]: I0314 08:20:29.731237 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:20:59 crc kubenswrapper[4893]: I0314 08:20:59.731390 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:20:59 crc kubenswrapper[4893]: 
I0314 08:20:59.731891 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:21:29 crc kubenswrapper[4893]: I0314 08:21:29.731569 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:21:29 crc kubenswrapper[4893]: I0314 08:21:29.732271 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:21:29 crc kubenswrapper[4893]: I0314 08:21:29.732331 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:21:29 crc kubenswrapper[4893]: I0314 08:21:29.733247 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:21:29 crc kubenswrapper[4893]: I0314 08:21:29.733322 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" containerID="cri-o://d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710" gracePeriod=600 Mar 14 08:21:30 crc kubenswrapper[4893]: I0314 08:21:30.735433 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710" exitCode=0 Mar 14 08:21:30 crc kubenswrapper[4893]: I0314 08:21:30.735502 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710"} Mar 14 08:21:30 crc kubenswrapper[4893]: I0314 08:21:30.735861 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"} Mar 14 08:21:30 crc kubenswrapper[4893]: I0314 08:21:30.735882 4893 scope.go:117] "RemoveContainer" containerID="7720035fe19d5f2843be206ed7d85829236750c2374247f9ff80fd08220bd900" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.139655 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557942-l8gwc"] Mar 14 08:22:00 crc kubenswrapper[4893]: E0314 08:22:00.140763 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f97d66-0b8d-4ba5-9792-0dd9b9a21472" containerName="oc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.140779 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f97d66-0b8d-4ba5-9792-0dd9b9a21472" containerName="oc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.140978 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f97d66-0b8d-4ba5-9792-0dd9b9a21472" containerName="oc" Mar 14 08:22:00 
crc kubenswrapper[4893]: I0314 08:22:00.141573 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.146909 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.147176 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.147297 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.151213 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-l8gwc"] Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.259447 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plw2q\" (UniqueName: \"kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q\") pod \"auto-csr-approver-29557942-l8gwc\" (UID: \"cd929b50-edb0-4172-88ed-e9dba2bae63f\") " pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.360952 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plw2q\" (UniqueName: \"kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q\") pod \"auto-csr-approver-29557942-l8gwc\" (UID: \"cd929b50-edb0-4172-88ed-e9dba2bae63f\") " pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.378183 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plw2q\" (UniqueName: \"kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q\") 
pod \"auto-csr-approver-29557942-l8gwc\" (UID: \"cd929b50-edb0-4172-88ed-e9dba2bae63f\") " pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.459761 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.849838 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-l8gwc"] Mar 14 08:22:00 crc kubenswrapper[4893]: W0314 08:22:00.856058 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd929b50_edb0_4172_88ed_e9dba2bae63f.slice/crio-59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2 WatchSource:0}: Error finding container 59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2: Status 404 returned error can't find the container with id 59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2 Mar 14 08:22:00 crc kubenswrapper[4893]: I0314 08:22:00.968785 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" event={"ID":"cd929b50-edb0-4172-88ed-e9dba2bae63f","Type":"ContainerStarted","Data":"59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2"} Mar 14 08:22:02 crc kubenswrapper[4893]: I0314 08:22:02.986661 4893 generic.go:334] "Generic (PLEG): container finished" podID="cd929b50-edb0-4172-88ed-e9dba2bae63f" containerID="5588770ba7a03358964c3150b45dd361cfb3b97c02eae71f7f27bb1bb998fe28" exitCode=0 Mar 14 08:22:02 crc kubenswrapper[4893]: I0314 08:22:02.986733 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" event={"ID":"cd929b50-edb0-4172-88ed-e9dba2bae63f","Type":"ContainerDied","Data":"5588770ba7a03358964c3150b45dd361cfb3b97c02eae71f7f27bb1bb998fe28"} Mar 14 08:22:04 crc kubenswrapper[4893]: 
I0314 08:22:04.311369 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:04 crc kubenswrapper[4893]: I0314 08:22:04.320786 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plw2q\" (UniqueName: \"kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q\") pod \"cd929b50-edb0-4172-88ed-e9dba2bae63f\" (UID: \"cd929b50-edb0-4172-88ed-e9dba2bae63f\") " Mar 14 08:22:04 crc kubenswrapper[4893]: I0314 08:22:04.328297 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q" (OuterVolumeSpecName: "kube-api-access-plw2q") pod "cd929b50-edb0-4172-88ed-e9dba2bae63f" (UID: "cd929b50-edb0-4172-88ed-e9dba2bae63f"). InnerVolumeSpecName "kube-api-access-plw2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:22:04 crc kubenswrapper[4893]: I0314 08:22:04.422053 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plw2q\" (UniqueName: \"kubernetes.io/projected/cd929b50-edb0-4172-88ed-e9dba2bae63f-kube-api-access-plw2q\") on node \"crc\" DevicePath \"\"" Mar 14 08:22:05 crc kubenswrapper[4893]: I0314 08:22:05.001860 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" event={"ID":"cd929b50-edb0-4172-88ed-e9dba2bae63f","Type":"ContainerDied","Data":"59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2"} Mar 14 08:22:05 crc kubenswrapper[4893]: I0314 08:22:05.002345 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59ad74565fa04f93afe7aeb3e6b0d8b283bf2135b0a04fd7f3a65087ce2d16b2" Mar 14 08:22:05 crc kubenswrapper[4893]: I0314 08:22:05.001919 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-l8gwc" Mar 14 08:22:05 crc kubenswrapper[4893]: I0314 08:22:05.372591 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-tr766"] Mar 14 08:22:05 crc kubenswrapper[4893]: I0314 08:22:05.385087 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-tr766"] Mar 14 08:22:07 crc kubenswrapper[4893]: I0314 08:22:07.385875 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5f27f2-590d-4349-ad1d-f5ddf73bd55c" path="/var/lib/kubelet/pods/fe5f27f2-590d-4349-ad1d-f5ddf73bd55c/volumes" Mar 14 08:22:09 crc kubenswrapper[4893]: I0314 08:22:09.771932 4893 scope.go:117] "RemoveContainer" containerID="1cb08c3199e521b7f07a08c97e8f12b0661b94b10ca646200ce9b1beb6dae5d1" Mar 14 08:23:59 crc kubenswrapper[4893]: I0314 08:23:59.730763 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:23:59 crc kubenswrapper[4893]: I0314 08:23:59.731415 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.157735 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557944-rl77b"] Mar 14 08:24:00 crc kubenswrapper[4893]: E0314 08:24:00.158099 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd929b50-edb0-4172-88ed-e9dba2bae63f" containerName="oc" Mar 14 08:24:00 crc 
kubenswrapper[4893]: I0314 08:24:00.158117 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd929b50-edb0-4172-88ed-e9dba2bae63f" containerName="oc" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.158272 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd929b50-edb0-4172-88ed-e9dba2bae63f" containerName="oc" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.158836 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.162142 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.162345 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.162941 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.168465 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-rl77b"] Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.259012 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qfd\" (UniqueName: \"kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd\") pod \"auto-csr-approver-29557944-rl77b\" (UID: \"3d7eda3e-abf0-407a-bb29-efee396949bc\") " pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.360788 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qfd\" (UniqueName: \"kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd\") pod \"auto-csr-approver-29557944-rl77b\" 
(UID: \"3d7eda3e-abf0-407a-bb29-efee396949bc\") " pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.379221 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qfd\" (UniqueName: \"kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd\") pod \"auto-csr-approver-29557944-rl77b\" (UID: \"3d7eda3e-abf0-407a-bb29-efee396949bc\") " pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.484314 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.720828 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-rl77b"] Mar 14 08:24:00 crc kubenswrapper[4893]: I0314 08:24:00.968728 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-rl77b" event={"ID":"3d7eda3e-abf0-407a-bb29-efee396949bc","Type":"ContainerStarted","Data":"9754bacfd86fcf93e880a1fed95d9fa0bf84921d6c7872404f6dad1ba9372206"} Mar 14 08:24:02 crc kubenswrapper[4893]: I0314 08:24:02.988298 4893 generic.go:334] "Generic (PLEG): container finished" podID="3d7eda3e-abf0-407a-bb29-efee396949bc" containerID="7ef3de15bcbd9abc0cd13171e49ba1e36807a035d530cf15d179a4a1c6f755cb" exitCode=0 Mar 14 08:24:02 crc kubenswrapper[4893]: I0314 08:24:02.988385 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-rl77b" event={"ID":"3d7eda3e-abf0-407a-bb29-efee396949bc","Type":"ContainerDied","Data":"7ef3de15bcbd9abc0cd13171e49ba1e36807a035d530cf15d179a4a1c6f755cb"} Mar 14 08:24:04 crc kubenswrapper[4893]: I0314 08:24:04.294988 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:04 crc kubenswrapper[4893]: I0314 08:24:04.345852 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qfd\" (UniqueName: \"kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd\") pod \"3d7eda3e-abf0-407a-bb29-efee396949bc\" (UID: \"3d7eda3e-abf0-407a-bb29-efee396949bc\") " Mar 14 08:24:04 crc kubenswrapper[4893]: I0314 08:24:04.350401 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd" (OuterVolumeSpecName: "kube-api-access-z7qfd") pod "3d7eda3e-abf0-407a-bb29-efee396949bc" (UID: "3d7eda3e-abf0-407a-bb29-efee396949bc"). InnerVolumeSpecName "kube-api-access-z7qfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:24:04 crc kubenswrapper[4893]: I0314 08:24:04.446977 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qfd\" (UniqueName: \"kubernetes.io/projected/3d7eda3e-abf0-407a-bb29-efee396949bc-kube-api-access-z7qfd\") on node \"crc\" DevicePath \"\"" Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.013070 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-rl77b" event={"ID":"3d7eda3e-abf0-407a-bb29-efee396949bc","Type":"ContainerDied","Data":"9754bacfd86fcf93e880a1fed95d9fa0bf84921d6c7872404f6dad1ba9372206"} Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.013136 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9754bacfd86fcf93e880a1fed95d9fa0bf84921d6c7872404f6dad1ba9372206" Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.013137 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-rl77b" Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.358329 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-mpczd"] Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.364665 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-mpczd"] Mar 14 08:24:05 crc kubenswrapper[4893]: I0314 08:24:05.385954 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1" path="/var/lib/kubelet/pods/aae93951-cc2f-49ab-a84b-3a2fdfb5d2a1/volumes" Mar 14 08:24:09 crc kubenswrapper[4893]: I0314 08:24:09.856278 4893 scope.go:117] "RemoveContainer" containerID="d9669102bdcc6575f93b9383cefc26fdeeadc8ba9b52001ae3e1fb75317c1a41" Mar 14 08:24:29 crc kubenswrapper[4893]: I0314 08:24:29.733051 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:24:29 crc kubenswrapper[4893]: I0314 08:24:29.733789 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:24:59 crc kubenswrapper[4893]: I0314 08:24:59.731094 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:24:59 crc kubenswrapper[4893]: 
I0314 08:24:59.731736 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:24:59 crc kubenswrapper[4893]: I0314 08:24:59.731791 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:24:59 crc kubenswrapper[4893]: I0314 08:24:59.732486 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:24:59 crc kubenswrapper[4893]: I0314 08:24:59.732567 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" gracePeriod=600 Mar 14 08:24:59 crc kubenswrapper[4893]: E0314 08:24:59.860376 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:25:00 crc kubenswrapper[4893]: I0314 08:25:00.451368 4893 generic.go:334] "Generic (PLEG): container finished" 
podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" exitCode=0 Mar 14 08:25:00 crc kubenswrapper[4893]: I0314 08:25:00.451421 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"} Mar 14 08:25:00 crc kubenswrapper[4893]: I0314 08:25:00.451469 4893 scope.go:117] "RemoveContainer" containerID="d044194b9e8d803d7530360c6cd8408591f39c223a9a3894c9a3515c611da710" Mar 14 08:25:00 crc kubenswrapper[4893]: I0314 08:25:00.452148 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:25:00 crc kubenswrapper[4893]: E0314 08:25:00.452597 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.541498 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:09 crc kubenswrapper[4893]: E0314 08:25:09.542408 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7eda3e-abf0-407a-bb29-efee396949bc" containerName="oc" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.542421 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7eda3e-abf0-407a-bb29-efee396949bc" containerName="oc" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.542567 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3d7eda3e-abf0-407a-bb29-efee396949bc" containerName="oc" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.543741 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.565741 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.706133 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.706202 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.706257 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc76r\" (UniqueName: \"kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.808690 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content\") pod \"community-operators-shk8l\" (UID: 
\"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.808785 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.808831 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc76r\" (UniqueName: \"kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.809308 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.809351 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities\") pod \"community-operators-shk8l\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.831866 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc76r\" (UniqueName: \"kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r\") pod \"community-operators-shk8l\" (UID: 
\"6e75172a-788f-4f51-af61-b344f39df901\") " pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:09 crc kubenswrapper[4893]: I0314 08:25:09.875182 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:10 crc kubenswrapper[4893]: I0314 08:25:10.343229 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:10 crc kubenswrapper[4893]: W0314 08:25:10.351319 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e75172a_788f_4f51_af61_b344f39df901.slice/crio-80bf1eac636b627e5a7eb3f31ad950ca24ad2e89620fcab4f015b72d96f5dfe2 WatchSource:0}: Error finding container 80bf1eac636b627e5a7eb3f31ad950ca24ad2e89620fcab4f015b72d96f5dfe2: Status 404 returned error can't find the container with id 80bf1eac636b627e5a7eb3f31ad950ca24ad2e89620fcab4f015b72d96f5dfe2 Mar 14 08:25:10 crc kubenswrapper[4893]: I0314 08:25:10.520901 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerStarted","Data":"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9"} Mar 14 08:25:10 crc kubenswrapper[4893]: I0314 08:25:10.521287 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerStarted","Data":"80bf1eac636b627e5a7eb3f31ad950ca24ad2e89620fcab4f015b72d96f5dfe2"} Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.528740 4893 generic.go:334] "Generic (PLEG): container finished" podID="6e75172a-788f-4f51-af61-b344f39df901" containerID="55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9" exitCode=0 Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.528784 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerDied","Data":"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9"} Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.531178 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.939338 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.941190 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:11 crc kubenswrapper[4893]: I0314 08:25:11.956556 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.041081 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.041167 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrkk\" (UniqueName: \"kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.041255 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.143043 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrkk\" (UniqueName: \"kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.143129 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.143211 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.143756 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.144382 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.166703 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrkk\" (UniqueName: \"kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk\") pod \"redhat-marketplace-hfml5\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.261365 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.549019 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerStarted","Data":"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214"} Mar 14 08:25:12 crc kubenswrapper[4893]: I0314 08:25:12.776454 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:12 crc kubenswrapper[4893]: W0314 08:25:12.799879 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb2bc44_75bd_4c66_ada5_02c20e6a70ee.slice/crio-23af136b24014daca4e7d37d815c980732b314c175997be20c08c4d959ead13b WatchSource:0}: Error finding container 23af136b24014daca4e7d37d815c980732b314c175997be20c08c4d959ead13b: Status 404 returned error can't find the container with id 23af136b24014daca4e7d37d815c980732b314c175997be20c08c4d959ead13b Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.376836 4893 scope.go:117] "RemoveContainer" 
containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:25:13 crc kubenswrapper[4893]: E0314 08:25:13.378189 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.565699 4893 generic.go:334] "Generic (PLEG): container finished" podID="6e75172a-788f-4f51-af61-b344f39df901" containerID="9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214" exitCode=0 Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.565918 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerDied","Data":"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214"} Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.572395 4893 generic.go:334] "Generic (PLEG): container finished" podID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerID="ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be" exitCode=0 Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.572479 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerDied","Data":"ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be"} Mar 14 08:25:13 crc kubenswrapper[4893]: I0314 08:25:13.572582 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" 
event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerStarted","Data":"23af136b24014daca4e7d37d815c980732b314c175997be20c08c4d959ead13b"} Mar 14 08:25:15 crc kubenswrapper[4893]: I0314 08:25:15.591596 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerStarted","Data":"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7"} Mar 14 08:25:15 crc kubenswrapper[4893]: I0314 08:25:15.593753 4893 generic.go:334] "Generic (PLEG): container finished" podID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerID="2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15" exitCode=0 Mar 14 08:25:15 crc kubenswrapper[4893]: I0314 08:25:15.593808 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerDied","Data":"2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15"} Mar 14 08:25:15 crc kubenswrapper[4893]: I0314 08:25:15.619311 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-shk8l" podStartSLOduration=3.157419325 podStartE2EDuration="6.619283996s" podCreationTimestamp="2026-03-14 08:25:09 +0000 UTC" firstStartedPulling="2026-03-14 08:25:11.530803294 +0000 UTC m=+5190.792980106" lastFinishedPulling="2026-03-14 08:25:14.992667985 +0000 UTC m=+5194.254844777" observedRunningTime="2026-03-14 08:25:15.614977061 +0000 UTC m=+5194.877153933" watchObservedRunningTime="2026-03-14 08:25:15.619283996 +0000 UTC m=+5194.881460818" Mar 14 08:25:16 crc kubenswrapper[4893]: I0314 08:25:16.603111 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" 
event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerStarted","Data":"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f"} Mar 14 08:25:16 crc kubenswrapper[4893]: I0314 08:25:16.624025 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfml5" podStartSLOduration=3.241114085 podStartE2EDuration="5.6240055s" podCreationTimestamp="2026-03-14 08:25:11 +0000 UTC" firstStartedPulling="2026-03-14 08:25:13.574367203 +0000 UTC m=+5192.836544035" lastFinishedPulling="2026-03-14 08:25:15.957258648 +0000 UTC m=+5195.219435450" observedRunningTime="2026-03-14 08:25:16.621628371 +0000 UTC m=+5195.883805183" watchObservedRunningTime="2026-03-14 08:25:16.6240055 +0000 UTC m=+5195.886182292" Mar 14 08:25:19 crc kubenswrapper[4893]: I0314 08:25:19.875610 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:19 crc kubenswrapper[4893]: I0314 08:25:19.876126 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:19 crc kubenswrapper[4893]: I0314 08:25:19.913051 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:20 crc kubenswrapper[4893]: I0314 08:25:20.665188 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:21 crc kubenswrapper[4893]: I0314 08:25:21.121733 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.262465 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.262555 4893 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.302295 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.640812 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-shk8l" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="registry-server" containerID="cri-o://1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7" gracePeriod=2 Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.713247 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:22 crc kubenswrapper[4893]: I0314 08:25:22.988641 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.111772 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content\") pod \"6e75172a-788f-4f51-af61-b344f39df901\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.111829 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc76r\" (UniqueName: \"kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r\") pod \"6e75172a-788f-4f51-af61-b344f39df901\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.111908 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities\") pod \"6e75172a-788f-4f51-af61-b344f39df901\" (UID: \"6e75172a-788f-4f51-af61-b344f39df901\") " Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.112950 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities" (OuterVolumeSpecName: "utilities") pod "6e75172a-788f-4f51-af61-b344f39df901" (UID: "6e75172a-788f-4f51-af61-b344f39df901"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.117179 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r" (OuterVolumeSpecName: "kube-api-access-mc76r") pod "6e75172a-788f-4f51-af61-b344f39df901" (UID: "6e75172a-788f-4f51-af61-b344f39df901"). InnerVolumeSpecName "kube-api-access-mc76r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.179401 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e75172a-788f-4f51-af61-b344f39df901" (UID: "6e75172a-788f-4f51-af61-b344f39df901"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.213352 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.213381 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc76r\" (UniqueName: \"kubernetes.io/projected/6e75172a-788f-4f51-af61-b344f39df901-kube-api-access-mc76r\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.213392 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e75172a-788f-4f51-af61-b344f39df901-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.647329 4893 generic.go:334] "Generic (PLEG): container finished" podID="6e75172a-788f-4f51-af61-b344f39df901" containerID="1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7" exitCode=0 Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.647359 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerDied","Data":"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7"} Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.647389 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shk8l" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.647409 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shk8l" event={"ID":"6e75172a-788f-4f51-af61-b344f39df901","Type":"ContainerDied","Data":"80bf1eac636b627e5a7eb3f31ad950ca24ad2e89620fcab4f015b72d96f5dfe2"} Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.647426 4893 scope.go:117] "RemoveContainer" containerID="1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.671108 4893 scope.go:117] "RemoveContainer" containerID="9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.683716 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.692013 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-shk8l"] Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.693039 4893 scope.go:117] "RemoveContainer" containerID="55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.725757 4893 scope.go:117] "RemoveContainer" containerID="1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7" Mar 14 08:25:23 crc kubenswrapper[4893]: E0314 08:25:23.726346 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7\": container with ID starting with 1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7 not found: ID does not exist" containerID="1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.726575 4893 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7"} err="failed to get container status \"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7\": rpc error: code = NotFound desc = could not find container \"1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7\": container with ID starting with 1893855bfc3c4747a5d4fc744e224c36e774172f8f97482c305c24ce08ce1db7 not found: ID does not exist" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.726684 4893 scope.go:117] "RemoveContainer" containerID="9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214" Mar 14 08:25:23 crc kubenswrapper[4893]: E0314 08:25:23.727623 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214\": container with ID starting with 9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214 not found: ID does not exist" containerID="9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.727686 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214"} err="failed to get container status \"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214\": rpc error: code = NotFound desc = could not find container \"9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214\": container with ID starting with 9eb75849b87281a3469194671fee7721703e266b2f75d6dcd2cb7b6ad60d5214 not found: ID does not exist" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.727730 4893 scope.go:117] "RemoveContainer" containerID="55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9" Mar 14 08:25:23 crc kubenswrapper[4893]: E0314 
08:25:23.728065 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9\": container with ID starting with 55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9 not found: ID does not exist" containerID="55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.728101 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9"} err="failed to get container status \"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9\": rpc error: code = NotFound desc = could not find container \"55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9\": container with ID starting with 55b12de1fe5fc886fe219e4d3f8cd99bdd15efb804270cca80298906ebaadea9 not found: ID does not exist" Mar 14 08:25:23 crc kubenswrapper[4893]: I0314 08:25:23.926412 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:25 crc kubenswrapper[4893]: I0314 08:25:25.376736 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:25:25 crc kubenswrapper[4893]: E0314 08:25:25.376968 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:25:25 crc kubenswrapper[4893]: I0314 08:25:25.386081 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6e75172a-788f-4f51-af61-b344f39df901" path="/var/lib/kubelet/pods/6e75172a-788f-4f51-af61-b344f39df901/volumes" Mar 14 08:25:25 crc kubenswrapper[4893]: I0314 08:25:25.662211 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hfml5" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="registry-server" containerID="cri-o://85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f" gracePeriod=2 Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.044762 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.161129 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities\") pod \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.161246 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content\") pod \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.161307 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxrkk\" (UniqueName: \"kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk\") pod \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\" (UID: \"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee\") " Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.162004 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities" (OuterVolumeSpecName: 
"utilities") pod "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" (UID: "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.166356 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk" (OuterVolumeSpecName: "kube-api-access-pxrkk") pod "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" (UID: "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee"). InnerVolumeSpecName "kube-api-access-pxrkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.187431 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" (UID: "fcb2bc44-75bd-4c66-ada5-02c20e6a70ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.262825 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxrkk\" (UniqueName: \"kubernetes.io/projected/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-kube-api-access-pxrkk\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.262879 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.262890 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.673473 4893 generic.go:334] "Generic (PLEG): container finished" podID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerID="85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f" exitCode=0 Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.673565 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerDied","Data":"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f"} Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.673607 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfml5" event={"ID":"fcb2bc44-75bd-4c66-ada5-02c20e6a70ee","Type":"ContainerDied","Data":"23af136b24014daca4e7d37d815c980732b314c175997be20c08c4d959ead13b"} Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.673628 4893 scope.go:117] "RemoveContainer" containerID="85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 
08:25:26.673760 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfml5" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.692407 4893 scope.go:117] "RemoveContainer" containerID="2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.708161 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.712901 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfml5"] Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.740392 4893 scope.go:117] "RemoveContainer" containerID="ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.757022 4893 scope.go:117] "RemoveContainer" containerID="85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f" Mar 14 08:25:26 crc kubenswrapper[4893]: E0314 08:25:26.757453 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f\": container with ID starting with 85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f not found: ID does not exist" containerID="85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.757498 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f"} err="failed to get container status \"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f\": rpc error: code = NotFound desc = could not find container \"85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f\": container with ID starting with 
85cb50b3833bd8e49ecf7ad150165d0bc65692c65b97251ce614d9a2bbae6b7f not found: ID does not exist" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.757550 4893 scope.go:117] "RemoveContainer" containerID="2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15" Mar 14 08:25:26 crc kubenswrapper[4893]: E0314 08:25:26.757915 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15\": container with ID starting with 2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15 not found: ID does not exist" containerID="2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.757938 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15"} err="failed to get container status \"2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15\": rpc error: code = NotFound desc = could not find container \"2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15\": container with ID starting with 2616f8ddb87300c03f941fb1e2525ca40ba6c7c44129145c635578e86db8ee15 not found: ID does not exist" Mar 14 08:25:26 crc kubenswrapper[4893]: I0314 08:25:26.757954 4893 scope.go:117] "RemoveContainer" containerID="ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be" Mar 14 08:25:26 crc kubenswrapper[4893]: E0314 08:25:26.758210 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be\": container with ID starting with ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be not found: ID does not exist" containerID="ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be" Mar 14 08:25:26 crc 
kubenswrapper[4893]: I0314 08:25:26.758233 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be"} err="failed to get container status \"ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be\": rpc error: code = NotFound desc = could not find container \"ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be\": container with ID starting with ac972245dc2a2686df4eb8f7845b3f7d73ba129777bf205b08c4e7b050ec91be not found: ID does not exist" Mar 14 08:25:27 crc kubenswrapper[4893]: I0314 08:25:27.384320 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" path="/var/lib/kubelet/pods/fcb2bc44-75bd-4c66-ada5-02c20e6a70ee/volumes" Mar 14 08:25:40 crc kubenswrapper[4893]: I0314 08:25:40.376711 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:25:40 crc kubenswrapper[4893]: E0314 08:25:40.377602 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:25:51 crc kubenswrapper[4893]: I0314 08:25:51.380673 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:25:51 crc kubenswrapper[4893]: E0314 08:25:51.381791 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.136887 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557946-dllnx"] Mar 14 08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137812 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="extract-utilities" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137830 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="extract-utilities" Mar 14 08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137850 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="extract-content" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137858 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="extract-content" Mar 14 08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137865 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="extract-utilities" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137873 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="extract-utilities" Mar 14 08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137890 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="registry-server" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137897 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="registry-server" Mar 14 
08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137910 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="registry-server" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137917 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="registry-server" Mar 14 08:26:00 crc kubenswrapper[4893]: E0314 08:26:00.137941 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="extract-content" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.137974 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="extract-content" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.138153 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e75172a-788f-4f51-af61-b344f39df901" containerName="registry-server" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.138168 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb2bc44-75bd-4c66-ada5-02c20e6a70ee" containerName="registry-server" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.138718 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.142423 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.142476 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.142697 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.151253 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-dllnx"] Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.255460 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvsn\" (UniqueName: \"kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn\") pod \"auto-csr-approver-29557946-dllnx\" (UID: \"79ca7dde-9be2-477e-a6fe-8bf762e0d234\") " pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.357494 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvsn\" (UniqueName: \"kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn\") pod \"auto-csr-approver-29557946-dllnx\" (UID: \"79ca7dde-9be2-477e-a6fe-8bf762e0d234\") " pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.375270 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvsn\" (UniqueName: \"kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn\") pod \"auto-csr-approver-29557946-dllnx\" (UID: \"79ca7dde-9be2-477e-a6fe-8bf762e0d234\") " 
pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.466320 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.872269 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-dllnx"] Mar 14 08:26:00 crc kubenswrapper[4893]: I0314 08:26:00.942670 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-dllnx" event={"ID":"79ca7dde-9be2-477e-a6fe-8bf762e0d234","Type":"ContainerStarted","Data":"985e2c25f0b0fae67b6f4fe00a9eea1a25be0728101708df77d5e41d75ce2452"} Mar 14 08:26:01 crc kubenswrapper[4893]: I0314 08:26:01.952806 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-dllnx" event={"ID":"79ca7dde-9be2-477e-a6fe-8bf762e0d234","Type":"ContainerStarted","Data":"fe473cf3aad2a371b299f9f39d2827e746b7c2b39c7784e3208f8afc8bb9f8ae"} Mar 14 08:26:01 crc kubenswrapper[4893]: I0314 08:26:01.968553 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557946-dllnx" podStartSLOduration=1.168910826 podStartE2EDuration="1.96853163s" podCreationTimestamp="2026-03-14 08:26:00 +0000 UTC" firstStartedPulling="2026-03-14 08:26:00.880485143 +0000 UTC m=+5240.142661935" lastFinishedPulling="2026-03-14 08:26:01.680105947 +0000 UTC m=+5240.942282739" observedRunningTime="2026-03-14 08:26:01.96320268 +0000 UTC m=+5241.225379462" watchObservedRunningTime="2026-03-14 08:26:01.96853163 +0000 UTC m=+5241.230708422" Mar 14 08:26:02 crc kubenswrapper[4893]: I0314 08:26:02.376548 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:26:02 crc kubenswrapper[4893]: E0314 08:26:02.376920 4893 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:26:02 crc kubenswrapper[4893]: I0314 08:26:02.966167 4893 generic.go:334] "Generic (PLEG): container finished" podID="79ca7dde-9be2-477e-a6fe-8bf762e0d234" containerID="fe473cf3aad2a371b299f9f39d2827e746b7c2b39c7784e3208f8afc8bb9f8ae" exitCode=0 Mar 14 08:26:02 crc kubenswrapper[4893]: I0314 08:26:02.966270 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-dllnx" event={"ID":"79ca7dde-9be2-477e-a6fe-8bf762e0d234","Type":"ContainerDied","Data":"fe473cf3aad2a371b299f9f39d2827e746b7c2b39c7784e3208f8afc8bb9f8ae"} Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.254975 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.423431 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nvsn\" (UniqueName: \"kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn\") pod \"79ca7dde-9be2-477e-a6fe-8bf762e0d234\" (UID: \"79ca7dde-9be2-477e-a6fe-8bf762e0d234\") " Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.430498 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn" (OuterVolumeSpecName: "kube-api-access-5nvsn") pod "79ca7dde-9be2-477e-a6fe-8bf762e0d234" (UID: "79ca7dde-9be2-477e-a6fe-8bf762e0d234"). InnerVolumeSpecName "kube-api-access-5nvsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.473234 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-hzvhr"] Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.479059 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-hzvhr"] Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.524732 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nvsn\" (UniqueName: \"kubernetes.io/projected/79ca7dde-9be2-477e-a6fe-8bf762e0d234-kube-api-access-5nvsn\") on node \"crc\" DevicePath \"\"" Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.984993 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-dllnx" event={"ID":"79ca7dde-9be2-477e-a6fe-8bf762e0d234","Type":"ContainerDied","Data":"985e2c25f0b0fae67b6f4fe00a9eea1a25be0728101708df77d5e41d75ce2452"} Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.985083 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="985e2c25f0b0fae67b6f4fe00a9eea1a25be0728101708df77d5e41d75ce2452" Mar 14 08:26:04 crc kubenswrapper[4893]: I0314 08:26:04.985126 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-dllnx" Mar 14 08:26:05 crc kubenswrapper[4893]: I0314 08:26:05.387990 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f97d66-0b8d-4ba5-9792-0dd9b9a21472" path="/var/lib/kubelet/pods/86f97d66-0b8d-4ba5-9792-0dd9b9a21472/volumes" Mar 14 08:26:09 crc kubenswrapper[4893]: I0314 08:26:09.951559 4893 scope.go:117] "RemoveContainer" containerID="b262c09ce542900ef4f708769b4f8e681a36fd9063735c1b7789726c0ec2d8ab" Mar 14 08:26:13 crc kubenswrapper[4893]: I0314 08:26:13.376729 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:26:13 crc kubenswrapper[4893]: E0314 08:26:13.377291 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:26:24 crc kubenswrapper[4893]: I0314 08:26:24.378028 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:26:24 crc kubenswrapper[4893]: E0314 08:26:24.380806 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:26:36 crc kubenswrapper[4893]: I0314 08:26:36.376946 4893 scope.go:117] "RemoveContainer" 
containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:26:36 crc kubenswrapper[4893]: E0314 08:26:36.378226 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:26:50 crc kubenswrapper[4893]: I0314 08:26:50.377033 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:26:50 crc kubenswrapper[4893]: E0314 08:26:50.378784 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:27:02 crc kubenswrapper[4893]: I0314 08:27:02.376712 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:27:02 crc kubenswrapper[4893]: E0314 08:27:02.377457 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:27:13 crc kubenswrapper[4893]: I0314 08:27:13.378023 4893 scope.go:117] 
"RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:27:13 crc kubenswrapper[4893]: E0314 08:27:13.379226 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:27:25 crc kubenswrapper[4893]: I0314 08:27:25.379496 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:27:25 crc kubenswrapper[4893]: E0314 08:27:25.382985 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:27:37 crc kubenswrapper[4893]: I0314 08:27:37.376889 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:27:37 crc kubenswrapper[4893]: E0314 08:27:37.378177 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:27:50 crc kubenswrapper[4893]: I0314 08:27:50.376664 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:27:50 crc kubenswrapper[4893]: E0314 08:27:50.377319 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.150251 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557948-l2h84"]
Mar 14 08:28:00 crc kubenswrapper[4893]: E0314 08:28:00.151131 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ca7dde-9be2-477e-a6fe-8bf762e0d234" containerName="oc"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.151146 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ca7dde-9be2-477e-a6fe-8bf762e0d234" containerName="oc"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.151340 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ca7dde-9be2-477e-a6fe-8bf762e0d234" containerName="oc"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.151875 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.154094 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.154703 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.155968 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.160477 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-l2h84"]
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.250649 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zlbl\" (UniqueName: \"kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl\") pod \"auto-csr-approver-29557948-l2h84\" (UID: \"92830b14-dfcc-4ea9-a289-c6a3cf957677\") " pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.352007 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zlbl\" (UniqueName: \"kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl\") pod \"auto-csr-approver-29557948-l2h84\" (UID: \"92830b14-dfcc-4ea9-a289-c6a3cf957677\") " pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.370946 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zlbl\" (UniqueName: \"kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl\") pod \"auto-csr-approver-29557948-l2h84\" (UID: \"92830b14-dfcc-4ea9-a289-c6a3cf957677\") " pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.486777 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:00 crc kubenswrapper[4893]: I0314 08:28:00.748322 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-l2h84"]
Mar 14 08:28:01 crc kubenswrapper[4893]: I0314 08:28:01.019149 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-l2h84" event={"ID":"92830b14-dfcc-4ea9-a289-c6a3cf957677","Type":"ContainerStarted","Data":"48c85497aac2f94f0eef783d99df5b5255cea205e187ba507866f368f4bb61e9"}
Mar 14 08:28:02 crc kubenswrapper[4893]: I0314 08:28:02.038996 4893 generic.go:334] "Generic (PLEG): container finished" podID="92830b14-dfcc-4ea9-a289-c6a3cf957677" containerID="8cc5b152061df97e15f797993ef1927fe7d1527506eb88b12172f2039d5acd86" exitCode=0
Mar 14 08:28:02 crc kubenswrapper[4893]: I0314 08:28:02.039160 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-l2h84" event={"ID":"92830b14-dfcc-4ea9-a289-c6a3cf957677","Type":"ContainerDied","Data":"8cc5b152061df97e15f797993ef1927fe7d1527506eb88b12172f2039d5acd86"}
Mar 14 08:28:03 crc kubenswrapper[4893]: I0314 08:28:03.328974 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:03 crc kubenswrapper[4893]: I0314 08:28:03.377392 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:28:03 crc kubenswrapper[4893]: E0314 08:28:03.377789 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:28:03 crc kubenswrapper[4893]: I0314 08:28:03.499641 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zlbl\" (UniqueName: \"kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl\") pod \"92830b14-dfcc-4ea9-a289-c6a3cf957677\" (UID: \"92830b14-dfcc-4ea9-a289-c6a3cf957677\") "
Mar 14 08:28:03 crc kubenswrapper[4893]: I0314 08:28:03.511029 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl" (OuterVolumeSpecName: "kube-api-access-4zlbl") pod "92830b14-dfcc-4ea9-a289-c6a3cf957677" (UID: "92830b14-dfcc-4ea9-a289-c6a3cf957677"). InnerVolumeSpecName "kube-api-access-4zlbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:28:03 crc kubenswrapper[4893]: I0314 08:28:03.601820 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zlbl\" (UniqueName: \"kubernetes.io/projected/92830b14-dfcc-4ea9-a289-c6a3cf957677-kube-api-access-4zlbl\") on node \"crc\" DevicePath \"\""
Mar 14 08:28:04 crc kubenswrapper[4893]: I0314 08:28:04.060842 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-l2h84" event={"ID":"92830b14-dfcc-4ea9-a289-c6a3cf957677","Type":"ContainerDied","Data":"48c85497aac2f94f0eef783d99df5b5255cea205e187ba507866f368f4bb61e9"}
Mar 14 08:28:04 crc kubenswrapper[4893]: I0314 08:28:04.060907 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c85497aac2f94f0eef783d99df5b5255cea205e187ba507866f368f4bb61e9"
Mar 14 08:28:04 crc kubenswrapper[4893]: I0314 08:28:04.060934 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-l2h84"
Mar 14 08:28:04 crc kubenswrapper[4893]: I0314 08:28:04.399782 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-l8gwc"]
Mar 14 08:28:04 crc kubenswrapper[4893]: I0314 08:28:04.408045 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-l8gwc"]
Mar 14 08:28:05 crc kubenswrapper[4893]: I0314 08:28:05.384166 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd929b50-edb0-4172-88ed-e9dba2bae63f" path="/var/lib/kubelet/pods/cd929b50-edb0-4172-88ed-e9dba2bae63f/volumes"
Mar 14 08:28:10 crc kubenswrapper[4893]: I0314 08:28:10.057013 4893 scope.go:117] "RemoveContainer" containerID="5588770ba7a03358964c3150b45dd361cfb3b97c02eae71f7f27bb1bb998fe28"
Mar 14 08:28:14 crc kubenswrapper[4893]: I0314 08:28:14.376626 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:28:14 crc kubenswrapper[4893]: E0314 08:28:14.377041 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:28:29 crc kubenswrapper[4893]: I0314 08:28:29.380241 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:28:29 crc kubenswrapper[4893]: E0314 08:28:29.381112 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.727949 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:28:43 crc kubenswrapper[4893]: E0314 08:28:43.728991 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92830b14-dfcc-4ea9-a289-c6a3cf957677" containerName="oc"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.729008 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="92830b14-dfcc-4ea9-a289-c6a3cf957677" containerName="oc"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.729188 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="92830b14-dfcc-4ea9-a289-c6a3cf957677" containerName="oc"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.730333 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.753793 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.896495 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.896670 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.896701 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztg76\" (UniqueName: \"kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.997550 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.997616 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztg76\" (UniqueName: \"kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.997677 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.998148 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:43 crc kubenswrapper[4893]: I0314 08:28:43.998227 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:44 crc kubenswrapper[4893]: I0314 08:28:44.022687 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztg76\" (UniqueName: \"kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76\") pod \"redhat-operators-slzwz\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") " pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:44 crc kubenswrapper[4893]: I0314 08:28:44.052689 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:44 crc kubenswrapper[4893]: I0314 08:28:44.290027 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:28:44 crc kubenswrapper[4893]: I0314 08:28:44.377129 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:28:44 crc kubenswrapper[4893]: E0314 08:28:44.377595 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:28:44 crc kubenswrapper[4893]: I0314 08:28:44.406160 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerStarted","Data":"88ed42d9ed8612c3f0e836351196f863e878df6c71600bd1d7e0a7611a2fd618"}
Mar 14 08:28:45 crc kubenswrapper[4893]: I0314 08:28:45.415785 4893 generic.go:334] "Generic (PLEG): container finished" podID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerID="f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c" exitCode=0
Mar 14 08:28:45 crc kubenswrapper[4893]: I0314 08:28:45.415856 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerDied","Data":"f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c"}
Mar 14 08:28:46 crc kubenswrapper[4893]: I0314 08:28:46.425421 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerStarted","Data":"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"}
Mar 14 08:28:47 crc kubenswrapper[4893]: I0314 08:28:47.437680 4893 generic.go:334] "Generic (PLEG): container finished" podID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerID="7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93" exitCode=0
Mar 14 08:28:47 crc kubenswrapper[4893]: I0314 08:28:47.437751 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerDied","Data":"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"}
Mar 14 08:28:48 crc kubenswrapper[4893]: I0314 08:28:48.449116 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerStarted","Data":"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"}
Mar 14 08:28:48 crc kubenswrapper[4893]: I0314 08:28:48.474321 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-slzwz" podStartSLOduration=3.017211916 podStartE2EDuration="5.474295012s" podCreationTimestamp="2026-03-14 08:28:43 +0000 UTC" firstStartedPulling="2026-03-14 08:28:45.41779208 +0000 UTC m=+5404.679968872" lastFinishedPulling="2026-03-14 08:28:47.874875136 +0000 UTC m=+5407.137051968" observedRunningTime="2026-03-14 08:28:48.467490796 +0000 UTC m=+5407.729667668" watchObservedRunningTime="2026-03-14 08:28:48.474295012 +0000 UTC m=+5407.736471824"
Mar 14 08:28:54 crc kubenswrapper[4893]: I0314 08:28:54.053457 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:54 crc kubenswrapper[4893]: I0314 08:28:54.054109 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:28:55 crc kubenswrapper[4893]: I0314 08:28:55.105323 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-slzwz" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:28:55 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:28:55 crc kubenswrapper[4893]: >
Mar 14 08:28:57 crc kubenswrapper[4893]: I0314 08:28:57.381430 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:28:57 crc kubenswrapper[4893]: E0314 08:28:57.381898 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:29:04 crc kubenswrapper[4893]: I0314 08:29:04.092228 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:29:04 crc kubenswrapper[4893]: I0314 08:29:04.131090 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:29:04 crc kubenswrapper[4893]: I0314 08:29:04.325340 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:29:05 crc kubenswrapper[4893]: I0314 08:29:05.741020 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-slzwz" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="registry-server" containerID="cri-o://39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f" gracePeriod=2
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.195627 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.391424 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities\") pod \"8e545005-53cb-437b-96cf-1010bd8f2f9e\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") "
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.391548 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztg76\" (UniqueName: \"kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76\") pod \"8e545005-53cb-437b-96cf-1010bd8f2f9e\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") "
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.391770 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content\") pod \"8e545005-53cb-437b-96cf-1010bd8f2f9e\" (UID: \"8e545005-53cb-437b-96cf-1010bd8f2f9e\") "
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.392235 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities" (OuterVolumeSpecName: "utilities") pod "8e545005-53cb-437b-96cf-1010bd8f2f9e" (UID: "8e545005-53cb-437b-96cf-1010bd8f2f9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.393192 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.400986 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76" (OuterVolumeSpecName: "kube-api-access-ztg76") pod "8e545005-53cb-437b-96cf-1010bd8f2f9e" (UID: "8e545005-53cb-437b-96cf-1010bd8f2f9e"). InnerVolumeSpecName "kube-api-access-ztg76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.494498 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztg76\" (UniqueName: \"kubernetes.io/projected/8e545005-53cb-437b-96cf-1010bd8f2f9e-kube-api-access-ztg76\") on node \"crc\" DevicePath \"\""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.528587 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e545005-53cb-437b-96cf-1010bd8f2f9e" (UID: "8e545005-53cb-437b-96cf-1010bd8f2f9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.595987 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e545005-53cb-437b-96cf-1010bd8f2f9e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.755017 4893 generic.go:334] "Generic (PLEG): container finished" podID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerID="39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f" exitCode=0
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.755067 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerDied","Data":"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"}
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.755106 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slzwz" event={"ID":"8e545005-53cb-437b-96cf-1010bd8f2f9e","Type":"ContainerDied","Data":"88ed42d9ed8612c3f0e836351196f863e878df6c71600bd1d7e0a7611a2fd618"}
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.755127 4893 scope.go:117] "RemoveContainer" containerID="39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.755152 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slzwz"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.788856 4893 scope.go:117] "RemoveContainer" containerID="7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.815801 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.827055 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-slzwz"]
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.836421 4893 scope.go:117] "RemoveContainer" containerID="f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.859887 4893 scope.go:117] "RemoveContainer" containerID="39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"
Mar 14 08:29:06 crc kubenswrapper[4893]: E0314 08:29:06.860456 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f\": container with ID starting with 39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f not found: ID does not exist" containerID="39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.860502 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f"} err="failed to get container status \"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f\": rpc error: code = NotFound desc = could not find container \"39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f\": container with ID starting with 39778b6f196adfb09e105407c05cd27e6b35c35c511e06e84cec58160d46eb1f not found: ID does not exist"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.860553 4893 scope.go:117] "RemoveContainer" containerID="7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"
Mar 14 08:29:06 crc kubenswrapper[4893]: E0314 08:29:06.861135 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93\": container with ID starting with 7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93 not found: ID does not exist" containerID="7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.861188 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93"} err="failed to get container status \"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93\": rpc error: code = NotFound desc = could not find container \"7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93\": container with ID starting with 7e08536aa92b443a57a39d62bffb6bfee63e859353f429cd57369c646559ee93 not found: ID does not exist"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.861221 4893 scope.go:117] "RemoveContainer" containerID="f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c"
Mar 14 08:29:06 crc kubenswrapper[4893]: E0314 08:29:06.861750 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c\": container with ID starting with f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c not found: ID does not exist" containerID="f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c"
Mar 14 08:29:06 crc kubenswrapper[4893]: I0314 08:29:06.861779 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c"} err="failed to get container status \"f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c\": rpc error: code = NotFound desc = could not find container \"f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c\": container with ID starting with f4a2336cdf200eb7fbc971c3599b32c0e54fae342cff6cdbd3e36634a4207e8c not found: ID does not exist"
Mar 14 08:29:07 crc kubenswrapper[4893]: I0314 08:29:07.387400 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" path="/var/lib/kubelet/pods/8e545005-53cb-437b-96cf-1010bd8f2f9e/volumes"
Mar 14 08:29:10 crc kubenswrapper[4893]: I0314 08:29:10.376645 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:29:10 crc kubenswrapper[4893]: E0314 08:29:10.377118 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:29:21 crc kubenswrapper[4893]: I0314 08:29:21.379915 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:29:21 crc kubenswrapper[4893]: E0314 08:29:21.380706 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:29:33 crc kubenswrapper[4893]: I0314 08:29:33.376494 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:29:33 crc kubenswrapper[4893]: E0314 08:29:33.377333 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:29:47 crc kubenswrapper[4893]: I0314 08:29:47.376723 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86"
Mar 14 08:29:47 crc kubenswrapper[4893]: E0314 08:29:47.377570 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.024599 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"]
Mar 14 08:29:58 crc kubenswrapper[4893]: E0314 08:29:58.025409 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="extract-content"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.025422 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="extract-content"
Mar 14 08:29:58 crc kubenswrapper[4893]: E0314 08:29:58.025448 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="registry-server"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.025455 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="registry-server"
Mar 14 08:29:58 crc kubenswrapper[4893]: E0314 08:29:58.025473 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="extract-utilities"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.025479 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="extract-utilities"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.025631 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e545005-53cb-437b-96cf-1010bd8f2f9e" containerName="registry-server"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.026578 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.079051 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"]
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.166790 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.166853 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zl7\" (UniqueName: \"kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.166886 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.268270 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.268343 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zl7\" (UniqueName: \"kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.268380 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.268878 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.268933 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.285662 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zl7\" (UniqueName: \"kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7\") pod \"certified-operators-xh5r9\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " pod="openshift-marketplace/certified-operators-xh5r9"
Mar 14 08:29:58 crc kubenswrapper[4893]: I0314 08:29:58.365494 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:29:59 crc kubenswrapper[4893]: I0314 08:29:59.008114 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"] Mar 14 08:29:59 crc kubenswrapper[4893]: I0314 08:29:59.172686 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerStarted","Data":"f0481b6ef92bd9ca1f059097ed7221e9dfdd3419988761faefd465cc39f66814"} Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.167482 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557950-dkbcc"] Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.168614 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.170923 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.173705 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq"] Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.174939 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.177344 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.177479 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.182295 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.182361 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.182449 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-dkbcc"] Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.188510 4893 generic.go:334] "Generic (PLEG): container finished" podID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerID="ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25" exitCode=0 Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.188608 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerDied","Data":"ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25"} Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.189937 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq"] Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.238509 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.239058 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqpw\" (UniqueName: \"kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw\") pod \"auto-csr-approver-29557950-dkbcc\" (UID: \"a84b836f-f0fd-4554-8a08-51f2fe190947\") " pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.239184 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gz4w\" (UniqueName: \"kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.239229 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.339764 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gz4w\" (UniqueName: \"kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.339829 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.339851 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.339911 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqpw\" (UniqueName: \"kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw\") pod \"auto-csr-approver-29557950-dkbcc\" (UID: \"a84b836f-f0fd-4554-8a08-51f2fe190947\") " pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.340773 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.357314 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume\") pod 
\"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.360264 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqpw\" (UniqueName: \"kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw\") pod \"auto-csr-approver-29557950-dkbcc\" (UID: \"a84b836f-f0fd-4554-8a08-51f2fe190947\") " pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.370358 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gz4w\" (UniqueName: \"kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w\") pod \"collect-profiles-29557950-gd5tq\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.487778 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.495136 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.928223 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq"] Mar 14 08:30:00 crc kubenswrapper[4893]: W0314 08:30:00.937032 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae27235_e9fd_4795_93b1_b3e050a2f894.slice/crio-413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1 WatchSource:0}: Error finding container 413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1: Status 404 returned error can't find the container with id 413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1 Mar 14 08:30:00 crc kubenswrapper[4893]: I0314 08:30:00.981608 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-dkbcc"] Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.195634 4893 generic.go:334] "Generic (PLEG): container finished" podID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerID="a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f" exitCode=0 Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.195688 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerDied","Data":"a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f"} Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.198346 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" event={"ID":"a84b836f-f0fd-4554-8a08-51f2fe190947","Type":"ContainerStarted","Data":"5537157ef73c30df1289f047ce1ffe9d0662c90d898d15d31911d714d690d9bd"} Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.200679 4893 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" event={"ID":"0ae27235-e9fd-4795-93b1-b3e050a2f894","Type":"ContainerStarted","Data":"5db305ff9fd5aac1d480b8e496f0bc51d1d94e18a5d2c8734625add377eee9b8"} Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.200724 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" event={"ID":"0ae27235-e9fd-4795-93b1-b3e050a2f894","Type":"ContainerStarted","Data":"413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1"} Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.249206 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" podStartSLOduration=1.249185371 podStartE2EDuration="1.249185371s" podCreationTimestamp="2026-03-14 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:01.235798514 +0000 UTC m=+5480.497975326" watchObservedRunningTime="2026-03-14 08:30:01.249185371 +0000 UTC m=+5480.511362173" Mar 14 08:30:01 crc kubenswrapper[4893]: I0314 08:30:01.381877 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:30:02 crc kubenswrapper[4893]: I0314 08:30:02.210973 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078"} Mar 14 08:30:02 crc kubenswrapper[4893]: I0314 08:30:02.214680 4893 generic.go:334] "Generic (PLEG): container finished" podID="0ae27235-e9fd-4795-93b1-b3e050a2f894" containerID="5db305ff9fd5aac1d480b8e496f0bc51d1d94e18a5d2c8734625add377eee9b8" exitCode=0 Mar 14 08:30:02 crc 
kubenswrapper[4893]: I0314 08:30:02.214773 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" event={"ID":"0ae27235-e9fd-4795-93b1-b3e050a2f894","Type":"ContainerDied","Data":"5db305ff9fd5aac1d480b8e496f0bc51d1d94e18a5d2c8734625add377eee9b8"} Mar 14 08:30:02 crc kubenswrapper[4893]: I0314 08:30:02.218041 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerStarted","Data":"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04"} Mar 14 08:30:02 crc kubenswrapper[4893]: I0314 08:30:02.250792 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xh5r9" podStartSLOduration=2.6172791010000003 podStartE2EDuration="4.250775178s" podCreationTimestamp="2026-03-14 08:29:58 +0000 UTC" firstStartedPulling="2026-03-14 08:30:00.19131565 +0000 UTC m=+5479.453492432" lastFinishedPulling="2026-03-14 08:30:01.824811707 +0000 UTC m=+5481.086988509" observedRunningTime="2026-03-14 08:30:02.24472514 +0000 UTC m=+5481.506901942" watchObservedRunningTime="2026-03-14 08:30:02.250775178 +0000 UTC m=+5481.512951970" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.226611 4893 generic.go:334] "Generic (PLEG): container finished" podID="a84b836f-f0fd-4554-8a08-51f2fe190947" containerID="ec6635435e148993d86b30cf27351aa2053df23fc218f5701b6c230e1b18ffc4" exitCode=0 Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.226673 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" event={"ID":"a84b836f-f0fd-4554-8a08-51f2fe190947","Type":"ContainerDied","Data":"ec6635435e148993d86b30cf27351aa2053df23fc218f5701b6c230e1b18ffc4"} Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.560643 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.684226 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gz4w\" (UniqueName: \"kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w\") pod \"0ae27235-e9fd-4795-93b1-b3e050a2f894\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.684286 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume\") pod \"0ae27235-e9fd-4795-93b1-b3e050a2f894\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.684383 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume\") pod \"0ae27235-e9fd-4795-93b1-b3e050a2f894\" (UID: \"0ae27235-e9fd-4795-93b1-b3e050a2f894\") " Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.685252 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ae27235-e9fd-4795-93b1-b3e050a2f894" (UID: "0ae27235-e9fd-4795-93b1-b3e050a2f894"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.706846 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ae27235-e9fd-4795-93b1-b3e050a2f894" (UID: "0ae27235-e9fd-4795-93b1-b3e050a2f894"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.707019 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w" (OuterVolumeSpecName: "kube-api-access-7gz4w") pod "0ae27235-e9fd-4795-93b1-b3e050a2f894" (UID: "0ae27235-e9fd-4795-93b1-b3e050a2f894"). InnerVolumeSpecName "kube-api-access-7gz4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.786089 4893 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae27235-e9fd-4795-93b1-b3e050a2f894-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.786128 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gz4w\" (UniqueName: \"kubernetes.io/projected/0ae27235-e9fd-4795-93b1-b3e050a2f894-kube-api-access-7gz4w\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:03 crc kubenswrapper[4893]: I0314 08:30:03.786138 4893 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ae27235-e9fd-4795-93b1-b3e050a2f894-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.239223 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" event={"ID":"0ae27235-e9fd-4795-93b1-b3e050a2f894","Type":"ContainerDied","Data":"413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1"} Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.239299 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413afd5aa236410b42f7c11c75c037aff114ad4f3ca6f9442d1de712b12617b1" Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.239235 4893 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-gd5tq" Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.324663 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq"] Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.336866 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-v85sq"] Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.497301 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.597482 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjqpw\" (UniqueName: \"kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw\") pod \"a84b836f-f0fd-4554-8a08-51f2fe190947\" (UID: \"a84b836f-f0fd-4554-8a08-51f2fe190947\") " Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.605994 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw" (OuterVolumeSpecName: "kube-api-access-pjqpw") pod "a84b836f-f0fd-4554-8a08-51f2fe190947" (UID: "a84b836f-f0fd-4554-8a08-51f2fe190947"). InnerVolumeSpecName "kube-api-access-pjqpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:30:04 crc kubenswrapper[4893]: I0314 08:30:04.699175 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjqpw\" (UniqueName: \"kubernetes.io/projected/a84b836f-f0fd-4554-8a08-51f2fe190947-kube-api-access-pjqpw\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.247869 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" event={"ID":"a84b836f-f0fd-4554-8a08-51f2fe190947","Type":"ContainerDied","Data":"5537157ef73c30df1289f047ce1ffe9d0662c90d898d15d31911d714d690d9bd"} Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.247928 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5537157ef73c30df1289f047ce1ffe9d0662c90d898d15d31911d714d690d9bd" Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.247953 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-dkbcc" Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.385919 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42235e44-e2b1-4658-a6f9-994a77d58696" path="/var/lib/kubelet/pods/42235e44-e2b1-4658-a6f9-994a77d58696/volumes" Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.555037 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-rl77b"] Mar 14 08:30:05 crc kubenswrapper[4893]: I0314 08:30:05.561749 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-rl77b"] Mar 14 08:30:07 crc kubenswrapper[4893]: I0314 08:30:07.391488 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7eda3e-abf0-407a-bb29-efee396949bc" path="/var/lib/kubelet/pods/3d7eda3e-abf0-407a-bb29-efee396949bc/volumes" Mar 14 08:30:08 crc kubenswrapper[4893]: I0314 08:30:08.365981 4893 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:08 crc kubenswrapper[4893]: I0314 08:30:08.366054 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:08 crc kubenswrapper[4893]: I0314 08:30:08.422728 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:09 crc kubenswrapper[4893]: I0314 08:30:09.333058 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:09 crc kubenswrapper[4893]: I0314 08:30:09.408324 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"] Mar 14 08:30:10 crc kubenswrapper[4893]: I0314 08:30:10.168506 4893 scope.go:117] "RemoveContainer" containerID="1b4d799c3397d0bacd3987a1f21bdcd15a65ff61014165699516fde7d0ec6ebe" Mar 14 08:30:10 crc kubenswrapper[4893]: I0314 08:30:10.189824 4893 scope.go:117] "RemoveContainer" containerID="7ef3de15bcbd9abc0cd13171e49ba1e36807a035d530cf15d179a4a1c6f755cb" Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.297390 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xh5r9" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="registry-server" containerID="cri-o://2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04" gracePeriod=2 Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.679034 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.798348 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities\") pod \"e452fd28-7c38-40e8-8679-acb0d21e521e\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.798902 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5zl7\" (UniqueName: \"kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7\") pod \"e452fd28-7c38-40e8-8679-acb0d21e521e\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.799000 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content\") pod \"e452fd28-7c38-40e8-8679-acb0d21e521e\" (UID: \"e452fd28-7c38-40e8-8679-acb0d21e521e\") " Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.799344 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities" (OuterVolumeSpecName: "utilities") pod "e452fd28-7c38-40e8-8679-acb0d21e521e" (UID: "e452fd28-7c38-40e8-8679-acb0d21e521e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.805880 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7" (OuterVolumeSpecName: "kube-api-access-w5zl7") pod "e452fd28-7c38-40e8-8679-acb0d21e521e" (UID: "e452fd28-7c38-40e8-8679-acb0d21e521e"). InnerVolumeSpecName "kube-api-access-w5zl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.900982 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5zl7\" (UniqueName: \"kubernetes.io/projected/e452fd28-7c38-40e8-8679-acb0d21e521e-kube-api-access-w5zl7\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:11 crc kubenswrapper[4893]: I0314 08:30:11.901015 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.313587 4893 generic.go:334] "Generic (PLEG): container finished" podID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerID="2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04" exitCode=0 Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.313592 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerDied","Data":"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04"} Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.313669 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh5r9" event={"ID":"e452fd28-7c38-40e8-8679-acb0d21e521e","Type":"ContainerDied","Data":"f0481b6ef92bd9ca1f059097ed7221e9dfdd3419988761faefd465cc39f66814"} Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.313700 4893 scope.go:117] "RemoveContainer" containerID="2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.317804 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xh5r9" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.336783 4893 scope.go:117] "RemoveContainer" containerID="a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.356272 4893 scope.go:117] "RemoveContainer" containerID="ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.398690 4893 scope.go:117] "RemoveContainer" containerID="2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04" Mar 14 08:30:12 crc kubenswrapper[4893]: E0314 08:30:12.399205 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04\": container with ID starting with 2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04 not found: ID does not exist" containerID="2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.399258 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04"} err="failed to get container status \"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04\": rpc error: code = NotFound desc = could not find container \"2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04\": container with ID starting with 2e7db4acec8be943ed88c0d6428b36b8b5aeee96bf812d8ef2c1fceb5301fb04 not found: ID does not exist" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.399291 4893 scope.go:117] "RemoveContainer" containerID="a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f" Mar 14 08:30:12 crc kubenswrapper[4893]: E0314 08:30:12.399661 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f\": container with ID starting with a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f not found: ID does not exist" containerID="a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.399691 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f"} err="failed to get container status \"a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f\": rpc error: code = NotFound desc = could not find container \"a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f\": container with ID starting with a73b95ae7fbf596ae2d5393144bab4059b62a6e46dfbeb52ab78a18d1a25231f not found: ID does not exist" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.399720 4893 scope.go:117] "RemoveContainer" containerID="ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25" Mar 14 08:30:12 crc kubenswrapper[4893]: E0314 08:30:12.399949 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25\": container with ID starting with ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25 not found: ID does not exist" containerID="ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.399978 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25"} err="failed to get container status \"ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25\": rpc error: code = NotFound desc = could not find container 
\"ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25\": container with ID starting with ae10efc3526b99e50b844502e512633ade4b6e92a784910965f995ee236f3e25 not found: ID does not exist" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.407657 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e452fd28-7c38-40e8-8679-acb0d21e521e" (UID: "e452fd28-7c38-40e8-8679-acb0d21e521e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.508323 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e452fd28-7c38-40e8-8679-acb0d21e521e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.665431 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"] Mar 14 08:30:12 crc kubenswrapper[4893]: I0314 08:30:12.671881 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xh5r9"] Mar 14 08:30:13 crc kubenswrapper[4893]: I0314 08:30:13.394344 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" path="/var/lib/kubelet/pods/e452fd28-7c38-40e8-8679-acb0d21e521e/volumes" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.136571 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557952-8n84s"] Mar 14 08:32:00 crc kubenswrapper[4893]: E0314 08:32:00.137339 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="extract-utilities" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137354 4893 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="extract-utilities" Mar 14 08:32:00 crc kubenswrapper[4893]: E0314 08:32:00.137369 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae27235-e9fd-4795-93b1-b3e050a2f894" containerName="collect-profiles" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137378 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae27235-e9fd-4795-93b1-b3e050a2f894" containerName="collect-profiles" Mar 14 08:32:00 crc kubenswrapper[4893]: E0314 08:32:00.137403 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="registry-server" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137409 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="registry-server" Mar 14 08:32:00 crc kubenswrapper[4893]: E0314 08:32:00.137423 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="extract-content" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137430 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="extract-content" Mar 14 08:32:00 crc kubenswrapper[4893]: E0314 08:32:00.137440 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84b836f-f0fd-4554-8a08-51f2fe190947" containerName="oc" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137448 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84b836f-f0fd-4554-8a08-51f2fe190947" containerName="oc" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137639 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84b836f-f0fd-4554-8a08-51f2fe190947" containerName="oc" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137656 4893 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e452fd28-7c38-40e8-8679-acb0d21e521e" containerName="registry-server" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.137664 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae27235-e9fd-4795-93b1-b3e050a2f894" containerName="collect-profiles" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.138106 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.141243 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.141386 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.141678 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.151775 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-8n84s"] Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.236948 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj4qk\" (UniqueName: \"kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk\") pod \"auto-csr-approver-29557952-8n84s\" (UID: \"087cb6ca-245e-409d-85dc-3e1127fa79c7\") " pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.338603 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj4qk\" (UniqueName: \"kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk\") pod \"auto-csr-approver-29557952-8n84s\" (UID: \"087cb6ca-245e-409d-85dc-3e1127fa79c7\") " 
pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.359959 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj4qk\" (UniqueName: \"kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk\") pod \"auto-csr-approver-29557952-8n84s\" (UID: \"087cb6ca-245e-409d-85dc-3e1127fa79c7\") " pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.471269 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.892922 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-8n84s"] Mar 14 08:32:00 crc kubenswrapper[4893]: I0314 08:32:00.897499 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:32:01 crc kubenswrapper[4893]: I0314 08:32:01.179141 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-8n84s" event={"ID":"087cb6ca-245e-409d-85dc-3e1127fa79c7","Type":"ContainerStarted","Data":"11a7b76aaf50db477b7bece1957c4ab4c2687311fcd90c638ebcda679a51524f"} Mar 14 08:32:02 crc kubenswrapper[4893]: I0314 08:32:02.188800 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-8n84s" event={"ID":"087cb6ca-245e-409d-85dc-3e1127fa79c7","Type":"ContainerStarted","Data":"bee0f519d3e5f5f9352b6d23fdbbddc534e45e7862e1761d039e68b32eb9c48c"} Mar 14 08:32:02 crc kubenswrapper[4893]: I0314 08:32:02.206710 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557952-8n84s" podStartSLOduration=1.388930252 podStartE2EDuration="2.206687538s" podCreationTimestamp="2026-03-14 08:32:00 +0000 UTC" firstStartedPulling="2026-03-14 
08:32:00.897192083 +0000 UTC m=+5600.159368885" lastFinishedPulling="2026-03-14 08:32:01.714949379 +0000 UTC m=+5600.977126171" observedRunningTime="2026-03-14 08:32:02.20265471 +0000 UTC m=+5601.464831502" watchObservedRunningTime="2026-03-14 08:32:02.206687538 +0000 UTC m=+5601.468864320" Mar 14 08:32:03 crc kubenswrapper[4893]: I0314 08:32:03.198050 4893 generic.go:334] "Generic (PLEG): container finished" podID="087cb6ca-245e-409d-85dc-3e1127fa79c7" containerID="bee0f519d3e5f5f9352b6d23fdbbddc534e45e7862e1761d039e68b32eb9c48c" exitCode=0 Mar 14 08:32:03 crc kubenswrapper[4893]: I0314 08:32:03.198112 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-8n84s" event={"ID":"087cb6ca-245e-409d-85dc-3e1127fa79c7","Type":"ContainerDied","Data":"bee0f519d3e5f5f9352b6d23fdbbddc534e45e7862e1761d039e68b32eb9c48c"} Mar 14 08:32:04 crc kubenswrapper[4893]: I0314 08:32:04.474391 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:04 crc kubenswrapper[4893]: I0314 08:32:04.592651 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj4qk\" (UniqueName: \"kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk\") pod \"087cb6ca-245e-409d-85dc-3e1127fa79c7\" (UID: \"087cb6ca-245e-409d-85dc-3e1127fa79c7\") " Mar 14 08:32:04 crc kubenswrapper[4893]: I0314 08:32:04.600495 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk" (OuterVolumeSpecName: "kube-api-access-nj4qk") pod "087cb6ca-245e-409d-85dc-3e1127fa79c7" (UID: "087cb6ca-245e-409d-85dc-3e1127fa79c7"). InnerVolumeSpecName "kube-api-access-nj4qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:04 crc kubenswrapper[4893]: I0314 08:32:04.694225 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj4qk\" (UniqueName: \"kubernetes.io/projected/087cb6ca-245e-409d-85dc-3e1127fa79c7-kube-api-access-nj4qk\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:05 crc kubenswrapper[4893]: I0314 08:32:05.210097 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-8n84s" event={"ID":"087cb6ca-245e-409d-85dc-3e1127fa79c7","Type":"ContainerDied","Data":"11a7b76aaf50db477b7bece1957c4ab4c2687311fcd90c638ebcda679a51524f"} Mar 14 08:32:05 crc kubenswrapper[4893]: I0314 08:32:05.210156 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a7b76aaf50db477b7bece1957c4ab4c2687311fcd90c638ebcda679a51524f" Mar 14 08:32:05 crc kubenswrapper[4893]: I0314 08:32:05.210123 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-8n84s" Mar 14 08:32:05 crc kubenswrapper[4893]: I0314 08:32:05.542014 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-dllnx"] Mar 14 08:32:05 crc kubenswrapper[4893]: I0314 08:32:05.547780 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-dllnx"] Mar 14 08:32:07 crc kubenswrapper[4893]: I0314 08:32:07.392876 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ca7dde-9be2-477e-a6fe-8bf762e0d234" path="/var/lib/kubelet/pods/79ca7dde-9be2-477e-a6fe-8bf762e0d234/volumes" Mar 14 08:32:10 crc kubenswrapper[4893]: I0314 08:32:10.318624 4893 scope.go:117] "RemoveContainer" containerID="fe473cf3aad2a371b299f9f39d2827e746b7c2b39c7784e3208f8afc8bb9f8ae" Mar 14 08:32:29 crc kubenswrapper[4893]: I0314 08:32:29.731113 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:32:29 crc kubenswrapper[4893]: I0314 08:32:29.731735 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:32:59 crc kubenswrapper[4893]: I0314 08:32:59.731453 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:32:59 crc kubenswrapper[4893]: I0314 08:32:59.732059 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:33:29 crc kubenswrapper[4893]: I0314 08:33:29.731146 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:33:29 crc kubenswrapper[4893]: I0314 08:33:29.731919 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:33:29 crc kubenswrapper[4893]: I0314 08:33:29.731986 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:33:29 crc kubenswrapper[4893]: I0314 08:33:29.732852 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:33:29 crc kubenswrapper[4893]: I0314 08:33:29.733018 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" containerID="cri-o://3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078" gracePeriod=600 Mar 14 08:33:30 crc kubenswrapper[4893]: I0314 08:33:30.304772 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078" exitCode=0 Mar 14 08:33:30 crc kubenswrapper[4893]: I0314 08:33:30.304836 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078"} Mar 14 08:33:30 crc kubenswrapper[4893]: I0314 08:33:30.305275 4893 scope.go:117] "RemoveContainer" containerID="9caed2e46027ef871354b1defae4055b2942b0a9ac8f8b215e24dd5cd352dc86" Mar 14 08:33:31 crc kubenswrapper[4893]: I0314 08:33:31.315647 4893 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerStarted","Data":"19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569"} Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.141487 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557954-kkc98"] Mar 14 08:34:00 crc kubenswrapper[4893]: E0314 08:34:00.142451 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087cb6ca-245e-409d-85dc-3e1127fa79c7" containerName="oc" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.142470 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="087cb6ca-245e-409d-85dc-3e1127fa79c7" containerName="oc" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.142646 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="087cb6ca-245e-409d-85dc-3e1127fa79c7" containerName="oc" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.143196 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.145880 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.146048 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.146074 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.150263 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-kkc98"] Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.222123 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st4w5\" (UniqueName: \"kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5\") pod \"auto-csr-approver-29557954-kkc98\" (UID: \"2613890a-4617-42c9-80db-d0d99e6ba0ab\") " pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.323471 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st4w5\" (UniqueName: \"kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5\") pod \"auto-csr-approver-29557954-kkc98\" (UID: \"2613890a-4617-42c9-80db-d0d99e6ba0ab\") " pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.344244 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st4w5\" (UniqueName: \"kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5\") pod \"auto-csr-approver-29557954-kkc98\" (UID: \"2613890a-4617-42c9-80db-d0d99e6ba0ab\") " 
pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.471733 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:00 crc kubenswrapper[4893]: I0314 08:34:00.907756 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-kkc98"] Mar 14 08:34:01 crc kubenswrapper[4893]: I0314 08:34:01.537480 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-kkc98" event={"ID":"2613890a-4617-42c9-80db-d0d99e6ba0ab","Type":"ContainerStarted","Data":"24d94ede5af4a73abb00f21a215225bf6d1fbe8e35d200376fcbee7eddbf90ee"} Mar 14 08:34:02 crc kubenswrapper[4893]: I0314 08:34:02.548127 4893 generic.go:334] "Generic (PLEG): container finished" podID="2613890a-4617-42c9-80db-d0d99e6ba0ab" containerID="d94b8b0a858ad536d84b96e2598879f33a149549252e75056a5b08c7c5e35c03" exitCode=0 Mar 14 08:34:02 crc kubenswrapper[4893]: I0314 08:34:02.548186 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-kkc98" event={"ID":"2613890a-4617-42c9-80db-d0d99e6ba0ab","Type":"ContainerDied","Data":"d94b8b0a858ad536d84b96e2598879f33a149549252e75056a5b08c7c5e35c03"} Mar 14 08:34:03 crc kubenswrapper[4893]: I0314 08:34:03.863940 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:03 crc kubenswrapper[4893]: I0314 08:34:03.975641 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st4w5\" (UniqueName: \"kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5\") pod \"2613890a-4617-42c9-80db-d0d99e6ba0ab\" (UID: \"2613890a-4617-42c9-80db-d0d99e6ba0ab\") " Mar 14 08:34:03 crc kubenswrapper[4893]: I0314 08:34:03.981257 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5" (OuterVolumeSpecName: "kube-api-access-st4w5") pod "2613890a-4617-42c9-80db-d0d99e6ba0ab" (UID: "2613890a-4617-42c9-80db-d0d99e6ba0ab"). InnerVolumeSpecName "kube-api-access-st4w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.077633 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st4w5\" (UniqueName: \"kubernetes.io/projected/2613890a-4617-42c9-80db-d0d99e6ba0ab-kube-api-access-st4w5\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.563967 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-kkc98" event={"ID":"2613890a-4617-42c9-80db-d0d99e6ba0ab","Type":"ContainerDied","Data":"24d94ede5af4a73abb00f21a215225bf6d1fbe8e35d200376fcbee7eddbf90ee"} Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.564291 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d94ede5af4a73abb00f21a215225bf6d1fbe8e35d200376fcbee7eddbf90ee" Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.564020 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-kkc98" Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.929510 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-l2h84"] Mar 14 08:34:04 crc kubenswrapper[4893]: I0314 08:34:04.937840 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-l2h84"] Mar 14 08:34:05 crc kubenswrapper[4893]: I0314 08:34:05.386117 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92830b14-dfcc-4ea9-a289-c6a3cf957677" path="/var/lib/kubelet/pods/92830b14-dfcc-4ea9-a289-c6a3cf957677/volumes" Mar 14 08:34:10 crc kubenswrapper[4893]: I0314 08:34:10.402308 4893 scope.go:117] "RemoveContainer" containerID="8cc5b152061df97e15f797993ef1927fe7d1527506eb88b12172f2039d5acd86" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.955270 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fq4zv/must-gather-rs75c"] Mar 14 08:34:33 crc kubenswrapper[4893]: E0314 08:34:33.955987 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2613890a-4617-42c9-80db-d0d99e6ba0ab" containerName="oc" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.956001 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="2613890a-4617-42c9-80db-d0d99e6ba0ab" containerName="oc" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.956136 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="2613890a-4617-42c9-80db-d0d99e6ba0ab" containerName="oc" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.956856 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.963736 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fq4zv"/"default-dockercfg-xscc7" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.971243 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fq4zv"/"openshift-service-ca.crt" Mar 14 08:34:33 crc kubenswrapper[4893]: I0314 08:34:33.974339 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fq4zv/must-gather-rs75c"] Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.030062 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fq4zv"/"kube-root-ca.crt" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.096304 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.096383 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghww\" (UniqueName: \"kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.198080 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghww\" (UniqueName: \"kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " 
pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.198577 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.198926 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.216259 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghww\" (UniqueName: \"kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww\") pod \"must-gather-rs75c\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.272310 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq4zv/must-gather-rs75c"
Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.700880 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fq4zv/must-gather-rs75c"]
Mar 14 08:34:34 crc kubenswrapper[4893]: I0314 08:34:34.789391 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq4zv/must-gather-rs75c" event={"ID":"cd49778a-3ef6-40f5-ac5f-d84b89b8c786","Type":"ContainerStarted","Data":"bfcb8911045b713c7711542b512502722ff6719d75915fb308fda33fdb93e880"}
Mar 14 08:34:41 crc kubenswrapper[4893]: I0314 08:34:41.866371 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq4zv/must-gather-rs75c" event={"ID":"cd49778a-3ef6-40f5-ac5f-d84b89b8c786","Type":"ContainerStarted","Data":"bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681"}
Mar 14 08:34:41 crc kubenswrapper[4893]: I0314 08:34:41.867018 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq4zv/must-gather-rs75c" event={"ID":"cd49778a-3ef6-40f5-ac5f-d84b89b8c786","Type":"ContainerStarted","Data":"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215"}
Mar 14 08:34:41 crc kubenswrapper[4893]: I0314 08:34:41.886148 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fq4zv/must-gather-rs75c" podStartSLOduration=2.241494222 podStartE2EDuration="8.886124769s" podCreationTimestamp="2026-03-14 08:34:33 +0000 UTC" firstStartedPulling="2026-03-14 08:34:34.705562887 +0000 UTC m=+5753.967739679" lastFinishedPulling="2026-03-14 08:34:41.350193434 +0000 UTC m=+5760.612370226" observedRunningTime="2026-03-14 08:34:41.882583272 +0000 UTC m=+5761.144760094" watchObservedRunningTime="2026-03-14 08:34:41.886124769 +0000 UTC m=+5761.148301591"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.313654 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.315610 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.328791 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.397329 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcrt\" (UniqueName: \"kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.397393 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.397582 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.499370 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.499463 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcrt\" (UniqueName: \"kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.499538 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.500380 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.500379 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.522732 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcrt\" (UniqueName: \"kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt\") pod \"community-operators-xflmr\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") " pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:10 crc kubenswrapper[4893]: I0314 08:35:10.635584 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:11 crc kubenswrapper[4893]: I0314 08:35:11.147347 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:12 crc kubenswrapper[4893]: I0314 08:35:12.068599 4893 generic.go:334] "Generic (PLEG): container finished" podID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerID="e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c" exitCode=0
Mar 14 08:35:12 crc kubenswrapper[4893]: I0314 08:35:12.068726 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerDied","Data":"e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c"}
Mar 14 08:35:12 crc kubenswrapper[4893]: I0314 08:35:12.068980 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerStarted","Data":"78b00042e826e1aa460faf71a3e4b7fcdb4efdd842aad9c7423c6c6e18d01e60"}
Mar 14 08:35:13 crc kubenswrapper[4893]: I0314 08:35:13.079062 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerStarted","Data":"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"}
Mar 14 08:35:14 crc kubenswrapper[4893]: I0314 08:35:14.087137 4893 generic.go:334] "Generic (PLEG): container finished" podID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerID="b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5" exitCode=0
Mar 14 08:35:14 crc kubenswrapper[4893]: I0314 08:35:14.087218 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerDied","Data":"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"}
Mar 14 08:35:15 crc kubenswrapper[4893]: I0314 08:35:15.108634 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerStarted","Data":"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"}
Mar 14 08:35:15 crc kubenswrapper[4893]: I0314 08:35:15.128990 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xflmr" podStartSLOduration=2.659010176 podStartE2EDuration="5.128972147s" podCreationTimestamp="2026-03-14 08:35:10 +0000 UTC" firstStartedPulling="2026-03-14 08:35:12.07085344 +0000 UTC m=+5791.333030242" lastFinishedPulling="2026-03-14 08:35:14.540815411 +0000 UTC m=+5793.802992213" observedRunningTime="2026-03-14 08:35:15.124130689 +0000 UTC m=+5794.386307481" watchObservedRunningTime="2026-03-14 08:35:15.128972147 +0000 UTC m=+5794.391148939"
Mar 14 08:35:20 crc kubenswrapper[4893]: I0314 08:35:20.636202 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:20 crc kubenswrapper[4893]: I0314 08:35:20.636955 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:20 crc kubenswrapper[4893]: I0314 08:35:20.695311 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:21 crc kubenswrapper[4893]: I0314 08:35:21.189466 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:21 crc kubenswrapper[4893]: I0314 08:35:21.238843 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.160138 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xflmr" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="registry-server" containerID="cri-o://e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf" gracePeriod=2
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.586727 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.693048 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content\") pod \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") "
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.693083 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities\") pod \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") "
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.693159 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcrt\" (UniqueName: \"kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt\") pod \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\" (UID: \"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc\") "
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.694150 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities" (OuterVolumeSpecName: "utilities") pod "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" (UID: "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.698221 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt" (OuterVolumeSpecName: "kube-api-access-4wcrt") pod "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" (UID: "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc"). InnerVolumeSpecName "kube-api-access-4wcrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.781724 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" (UID: "0ac8cdd2-fa0c-4bc6-9045-68e452487ccc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.794498 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.794730 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:35:23 crc kubenswrapper[4893]: I0314 08:35:23.794825 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcrt\" (UniqueName: \"kubernetes.io/projected/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc-kube-api-access-4wcrt\") on node \"crc\" DevicePath \"\""
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.166234 4893 generic.go:334] "Generic (PLEG): container finished" podID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerID="e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf" exitCode=0
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.166269 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerDied","Data":"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"}
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.166295 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xflmr" event={"ID":"0ac8cdd2-fa0c-4bc6-9045-68e452487ccc","Type":"ContainerDied","Data":"78b00042e826e1aa460faf71a3e4b7fcdb4efdd842aad9c7423c6c6e18d01e60"}
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.166313 4893 scope.go:117] "RemoveContainer" containerID="e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.166436 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xflmr"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.199426 4893 scope.go:117] "RemoveContainer" containerID="b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.203551 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.213175 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xflmr"]
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.218634 4893 scope.go:117] "RemoveContainer" containerID="e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.250111 4893 scope.go:117] "RemoveContainer" containerID="e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"
Mar 14 08:35:24 crc kubenswrapper[4893]: E0314 08:35:24.250550 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf\": container with ID starting with e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf not found: ID does not exist" containerID="e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.250595 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf"} err="failed to get container status \"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf\": rpc error: code = NotFound desc = could not find container \"e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf\": container with ID starting with e04ad3836e499e61a30a323a02995a81a593326a1909f8f2d1758744e29672cf not found: ID does not exist"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.250622 4893 scope.go:117] "RemoveContainer" containerID="b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"
Mar 14 08:35:24 crc kubenswrapper[4893]: E0314 08:35:24.250854 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5\": container with ID starting with b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5 not found: ID does not exist" containerID="b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.250886 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5"} err="failed to get container status \"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5\": rpc error: code = NotFound desc = could not find container \"b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5\": container with ID starting with b43fca65b6dd3a592835c1b2653d02f2d5633c95a742cb308705aa0ca2f3d7b5 not found: ID does not exist"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.250905 4893 scope.go:117] "RemoveContainer" containerID="e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c"
Mar 14 08:35:24 crc kubenswrapper[4893]: E0314 08:35:24.251263 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c\": container with ID starting with e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c not found: ID does not exist" containerID="e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c"
Mar 14 08:35:24 crc kubenswrapper[4893]: I0314 08:35:24.251432 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c"} err="failed to get container status \"e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c\": rpc error: code = NotFound desc = could not find container \"e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c\": container with ID starting with e802c3646d958ffcf4fd2dbccd168d98b73752d54577437b6456eda799d9582c not found: ID does not exist"
Mar 14 08:35:25 crc kubenswrapper[4893]: I0314 08:35:25.386547 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" path="/var/lib/kubelet/pods/0ac8cdd2-fa0c-4bc6-9045-68e452487ccc/volumes"
Mar 14 08:35:40 crc kubenswrapper[4893]: I0314 08:35:40.700379 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/util/0.log"
Mar 14 08:35:40 crc kubenswrapper[4893]: I0314 08:35:40.864048 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/util/0.log"
Mar 14 08:35:40 crc kubenswrapper[4893]: I0314 08:35:40.890364 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/pull/0.log"
Mar 14 08:35:40 crc kubenswrapper[4893]: I0314 08:35:40.920442 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/pull/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.079141 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/extract/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.082249 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/util/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.110710 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298xwz7b_54ba629d-3d14-4c08-92ca-430abfe4177c/pull/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.346644 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-lg76w_bb479ff8-15fb-458b-87c7-ec4b3d15721d/manager/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.480007 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-cjqhn_10f5729a-cf9e-4851-9173-2c6c0d6dbcf0/manager/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.683537 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-65klp_9e7e613b-2bb4-40a5-a8c2-5649763d4a61/manager/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.736587 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-qfg79_a8ac6ab6-a354-4d94-9bfb-48ae4b15ff88/manager/0.log"
Mar 14 08:35:41 crc kubenswrapper[4893]: I0314 08:35:41.921465 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-djn7l_26bb4dec-8e1a-4fdc-a7ca-808ae7026afe/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.195372 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-gm6pg_1e068b17-cab4-421d-b501-cd825de6b67c/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.327616 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-qmt9z_9966deb5-a260-43a8-bec4-772b6308266d/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.442104 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-xfpdm_01b24940-e97d-472d-902c-87bfe6b67147/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.545685 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-tlm9k_7a3a590a-bacd-4d27-91f1-1b6afd52ab3e/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.724352 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-cxg74_b8da0600-7ae7-4f7a-8b3e-c81523dc6034/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.859860 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-xr2zr_49eeeddb-7017-4642-aabb-ea932fa16ac7/manager/0.log"
Mar 14 08:35:42 crc kubenswrapper[4893]: I0314 08:35:42.902346 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-zkh7h_c50bd703-5cd9-418a-b623-4809a8ff4213/manager/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.034020 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-xjwp7_0997c764-6b32-41ea-adca-b04feb1fbe6f/manager/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.099203 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-gztc6_1a01facb-b7c0-476b-96b1-759e6e9f3c30/manager/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.192293 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f7958d774rh7fr_ba3c47d0-78fd-4b4d-8aa8-66c2379f8c0f/manager/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.414978 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6dc56d8cd6-q649l_271e053c-65a4-4422-9a1c-132c83e80ce5/operator/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.724251 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cvls2_1ec921c3-291e-460e-abe1-d981ae06b426/registry-server/0.log"
Mar 14 08:35:43 crc kubenswrapper[4893]: I0314 08:35:43.798071 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-7967f_2b0418ea-6737-4c52-b1bc-915fba5ef735/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.074366 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6484b7b757-kd7nd_be24c0b7-17b5-4962-b12d-f438a21b953f/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.143994 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-kwcbl_887bf8c6-7149-4e9b-b67c-bc70791532b2/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.191025 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dq859_7bab7ac9-4699-41cc-b1ca-0f344b13ab15/operator/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.296529 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-wgkn7_373cda74-3e73-4a92-9d91-395826ab1864/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.403276 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-49ss2_d92c0e9c-9151-4619-8ce6-9eb5cd77d093/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.482893 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-lbg2b_b440d436-7017-4503-823f-16e998dbf74d/manager/0.log"
Mar 14 08:35:44 crc kubenswrapper[4893]: I0314 08:35:44.596768 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-9rs6t_633253f4-c3d4-439d-9b79-dee8c6d41bdc/manager/0.log"
Mar 14 08:35:59 crc kubenswrapper[4893]: I0314 08:35:59.731103 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:35:59 crc kubenswrapper[4893]: I0314 08:35:59.732590 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.150648 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557956-8hqv8"]
Mar 14 08:36:00 crc kubenswrapper[4893]: E0314 08:36:00.151226 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="registry-server"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.151242 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="registry-server"
Mar 14 08:36:00 crc kubenswrapper[4893]: E0314 08:36:00.151261 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="extract-content"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.151267 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="extract-content"
Mar 14 08:36:00 crc kubenswrapper[4893]: E0314 08:36:00.151279 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="extract-utilities"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.151285 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="extract-utilities"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.151409 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac8cdd2-fa0c-4bc6-9045-68e452487ccc" containerName="registry-server"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.151846 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.156131 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.156708 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.157764 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.163286 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-8hqv8"]
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.288819 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrpd\" (UniqueName: \"kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd\") pod \"auto-csr-approver-29557956-8hqv8\" (UID: \"519ad268-04fb-45a8-adeb-261c1f9e8f8d\") " pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.390883 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrpd\" (UniqueName: \"kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd\") pod \"auto-csr-approver-29557956-8hqv8\" (UID: \"519ad268-04fb-45a8-adeb-261c1f9e8f8d\") " pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.412181 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrpd\" (UniqueName: \"kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd\") pod \"auto-csr-approver-29557956-8hqv8\" (UID: \"519ad268-04fb-45a8-adeb-261c1f9e8f8d\") " pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.472605 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:00 crc kubenswrapper[4893]: I0314 08:36:00.708443 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-8hqv8"]
Mar 14 08:36:01 crc kubenswrapper[4893]: I0314 08:36:01.414751 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-8hqv8" event={"ID":"519ad268-04fb-45a8-adeb-261c1f9e8f8d","Type":"ContainerStarted","Data":"b2a2919ca15e7d331973226411864b495786fa09636ea1fb3d0871080c876cfd"}
Mar 14 08:36:02 crc kubenswrapper[4893]: I0314 08:36:02.422003 4893 generic.go:334] "Generic (PLEG): container finished" podID="519ad268-04fb-45a8-adeb-261c1f9e8f8d" containerID="82b395ff6d3d5181dcc96cefd0dab0286dcb5be76fb8d4d2ea8444afb2a04089" exitCode=0
Mar 14 08:36:02 crc kubenswrapper[4893]: I0314 08:36:02.422060 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-8hqv8" event={"ID":"519ad268-04fb-45a8-adeb-261c1f9e8f8d","Type":"ContainerDied","Data":"82b395ff6d3d5181dcc96cefd0dab0286dcb5be76fb8d4d2ea8444afb2a04089"}
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.195391 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v7lxq_f23fcc26-f800-497a-b038-065687659df7/control-plane-machine-set-operator/0.log"
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.370012 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vbg6j_871c529e-b263-4600-982a-d7be266f86e4/kube-rbac-proxy/0.log"
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.397442 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vbg6j_871c529e-b263-4600-982a-d7be266f86e4/machine-api-operator/0.log"
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.731057 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.841233 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whrpd\" (UniqueName: \"kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd\") pod \"519ad268-04fb-45a8-adeb-261c1f9e8f8d\" (UID: \"519ad268-04fb-45a8-adeb-261c1f9e8f8d\") "
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.846703 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd" (OuterVolumeSpecName: "kube-api-access-whrpd") pod "519ad268-04fb-45a8-adeb-261c1f9e8f8d" (UID: "519ad268-04fb-45a8-adeb-261c1f9e8f8d"). InnerVolumeSpecName "kube-api-access-whrpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:36:03 crc kubenswrapper[4893]: I0314 08:36:03.942376 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whrpd\" (UniqueName: \"kubernetes.io/projected/519ad268-04fb-45a8-adeb-261c1f9e8f8d-kube-api-access-whrpd\") on node \"crc\" DevicePath \"\""
Mar 14 08:36:04 crc kubenswrapper[4893]: I0314 08:36:04.436133 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-8hqv8" event={"ID":"519ad268-04fb-45a8-adeb-261c1f9e8f8d","Type":"ContainerDied","Data":"b2a2919ca15e7d331973226411864b495786fa09636ea1fb3d0871080c876cfd"}
Mar 14 08:36:04 crc kubenswrapper[4893]: I0314 08:36:04.436544 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a2919ca15e7d331973226411864b495786fa09636ea1fb3d0871080c876cfd"
Mar 14 08:36:04 crc kubenswrapper[4893]: I0314 08:36:04.436164 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-8hqv8"
Mar 14 08:36:04 crc kubenswrapper[4893]: I0314 08:36:04.799691 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-dkbcc"]
Mar 14 08:36:04 crc kubenswrapper[4893]: I0314 08:36:04.804273 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-dkbcc"]
Mar 14 08:36:05 crc kubenswrapper[4893]: I0314 08:36:05.385259 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84b836f-f0fd-4554-8a08-51f2fe190947" path="/var/lib/kubelet/pods/a84b836f-f0fd-4554-8a08-51f2fe190947/volumes"
Mar 14 08:36:10 crc kubenswrapper[4893]: I0314 08:36:10.490796 4893 scope.go:117] "RemoveContainer" containerID="ec6635435e148993d86b30cf27351aa2053df23fc218f5701b6c230e1b18ffc4"
Mar 14 08:36:14 crc kubenswrapper[4893]: I0314 08:36:14.914388 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-7vxnn_4b666b54-a5a5-4494-b03c-0d08c0bf9a79/cert-manager-controller/0.log"
Mar 14 08:36:15 crc kubenswrapper[4893]: I0314 08:36:15.070831 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-fdtkk_8db0bf07-4adf-46db-8410-0e7b02326c81/cert-manager-cainjector/0.log"
Mar 14 08:36:15 crc kubenswrapper[4893]: I0314 08:36:15.183152 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-khrgn_6ccf1897-752e-420b-9ba9-ed767e6eac38/cert-manager-webhook/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.084405 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-vzjv7_7d9fe9c4-e01d-4c14-babe-c16587326c63/nmstate-console-plugin/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.293734 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k468n_132b2e20-99cc-4cba-b1dd-9d7f51cb774c/nmstate-handler/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.345781 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cqr9d_d76392bf-1abc-406a-bd1e-bb95efdba8fc/kube-rbac-proxy/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.352066 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cqr9d_d76392bf-1abc-406a-bd1e-bb95efdba8fc/nmstate-metrics/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.478338 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-cwxwg_ac8e2f6f-8e50-46f3-b062-310098d7091c/nmstate-operator/0.log"
Mar 14 08:36:28 crc kubenswrapper[4893]: I0314 08:36:28.502454 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-v47ts_d6b31376-0d73-47a7-93e6-26f77823be30/nmstate-webhook/0.log"
Mar 14 08:36:29 crc kubenswrapper[4893]: I0314 08:36:29.730899 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:36:29 crc kubenswrapper[4893]: I0314 08:36:29.732452 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:36:53 crc kubenswrapper[4893]: I0314 08:36:53.770297 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-clzml_92e0ed09-c3c1-42a9-9405-fcf906686b43/kube-rbac-proxy/0.log"
Mar 14 08:36:53 crc kubenswrapper[4893]: I0314 08:36:53.957161 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-frr-files/0.log"
Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.078812 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-clzml_92e0ed09-c3c1-42a9-9405-fcf906686b43/controller/0.log"
Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.152439 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-frr-files/0.log"
Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.180099 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-metrics/0.log"
Mar 14 08:36:54 crc
kubenswrapper[4893]: I0314 08:36:54.193572 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-reloader/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.243050 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-reloader/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.426515 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-metrics/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.443207 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-metrics/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.480560 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-frr-files/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.482341 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-reloader/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.625063 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-reloader/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.651538 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-frr-files/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.677331 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/cp-metrics/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.701336 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/controller/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.850092 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/frr-metrics/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.911915 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/kube-rbac-proxy/0.log" Mar 14 08:36:54 crc kubenswrapper[4893]: I0314 08:36:54.984781 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/kube-rbac-proxy-frr/0.log" Mar 14 08:36:55 crc kubenswrapper[4893]: I0314 08:36:55.090655 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/reloader/0.log" Mar 14 08:36:55 crc kubenswrapper[4893]: I0314 08:36:55.215774 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-s55qs_c697e0ec-f27c-4fff-9e27-bb5ba9738ad5/frr-k8s-webhook-server/0.log" Mar 14 08:36:55 crc kubenswrapper[4893]: I0314 08:36:55.348291 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7447f7bf6-2crtw_02873c8a-a664-48db-bc40-5c9862429b0f/manager/0.log" Mar 14 08:36:55 crc kubenswrapper[4893]: I0314 08:36:55.530374 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f5cb5468c-x85cn_fe78adb3-52b5-406b-b4e9-64111677415f/webhook-server/0.log" Mar 14 08:36:55 crc kubenswrapper[4893]: I0314 08:36:55.647603 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2pzp9_25c5b449-1fb7-4fa3-80da-3227b24100d5/kube-rbac-proxy/0.log" Mar 14 08:36:56 crc kubenswrapper[4893]: I0314 08:36:56.204004 4893 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2pzp9_25c5b449-1fb7-4fa3-80da-3227b24100d5/speaker/0.log" Mar 14 08:36:56 crc kubenswrapper[4893]: I0314 08:36:56.545116 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7m6db_17f61fcc-48c5-42ef-97a4-9440159a79fc/frr/0.log" Mar 14 08:36:59 crc kubenswrapper[4893]: I0314 08:36:59.731394 4893 patch_prober.go:28] interesting pod/machine-config-daemon-d4x6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:36:59 crc kubenswrapper[4893]: I0314 08:36:59.731905 4893 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:36:59 crc kubenswrapper[4893]: I0314 08:36:59.731976 4893 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" Mar 14 08:36:59 crc kubenswrapper[4893]: I0314 08:36:59.732901 4893 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569"} pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:36:59 crc kubenswrapper[4893]: I0314 08:36:59.733004 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" 
containerName="machine-config-daemon" containerID="cri-o://19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" gracePeriod=600 Mar 14 08:36:59 crc kubenswrapper[4893]: E0314 08:36:59.858079 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:37:00 crc kubenswrapper[4893]: I0314 08:37:00.839306 4893 generic.go:334] "Generic (PLEG): container finished" podID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" exitCode=0 Mar 14 08:37:00 crc kubenswrapper[4893]: I0314 08:37:00.839454 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" event={"ID":"ad6724e5-48cf-4417-ae51-b1cb8c6af70d","Type":"ContainerDied","Data":"19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569"} Mar 14 08:37:00 crc kubenswrapper[4893]: I0314 08:37:00.840414 4893 scope.go:117] "RemoveContainer" containerID="3083f08c6eb6e0b335286ff295ebf4380e7a6e56f11d4fec39ab2b9ffa84f078" Mar 14 08:37:00 crc kubenswrapper[4893]: I0314 08:37:00.841444 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:37:00 crc kubenswrapper[4893]: E0314 08:37:00.841862 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:37:08 crc kubenswrapper[4893]: I0314 08:37:08.807190 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/util/0.log" Mar 14 08:37:08 crc kubenswrapper[4893]: I0314 08:37:08.990587 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/util/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.033960 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.093943 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.198879 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/util/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.227577 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/extract/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.262875 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874cdkvf_eee28f73-c7eb-4a40-b0a0-3eb53b1cba7f/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 
08:37:09.490962 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/util/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.693081 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.696499 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.765028 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/util/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.834122 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/util/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.913967 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/pull/0.log" Mar 14 08:37:09 crc kubenswrapper[4893]: I0314 08:37:09.923498 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1gxkwj_901f8898-8f02-4d53-9795-99e707b401c6/extract/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.070324 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/util/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.237954 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/util/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.244665 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/pull/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.261454 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/pull/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.411736 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/extract/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.479228 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/pull/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.504664 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fhp5n_fb3756e8-b82d-40c5-90c6-0cc4114980a6/util/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.621595 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-utilities/0.log" Mar 14 08:37:10 crc 
kubenswrapper[4893]: I0314 08:37:10.756739 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-utilities/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.775087 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-content/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.782698 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-content/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.946243 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-utilities/0.log" Mar 14 08:37:10 crc kubenswrapper[4893]: I0314 08:37:10.947988 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/extract-content/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.167284 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-utilities/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.373209 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-utilities/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.405511 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-content/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.445748 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-content/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.599866 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-content/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.620216 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/extract-utilities/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.781700 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcmps_d21f5443-3331-4559-814d-5a68ee167fa5/registry-server/0.log" Mar 14 08:37:11 crc kubenswrapper[4893]: I0314 08:37:11.819263 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8kqzg_34d6b9ea-7ada-4230-a712-fa63c21038a1/marketplace-operator/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.069765 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-utilities/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.216001 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mxkg9_f5c55a2d-4a16-4a38-97d2-1af4328c8ae7/registry-server/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.245433 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-content/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.278891 4893 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-utilities/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.349636 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-content/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.501723 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-content/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.550058 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/extract-utilities/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.693998 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-utilities/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.764192 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2cjr_a2bde7b6-4ac1-40b1-b678-c64594684743/registry-server/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.883343 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-utilities/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.883494 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-content/0.log" Mar 14 08:37:12 crc kubenswrapper[4893]: I0314 08:37:12.907410 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-content/0.log" 
Mar 14 08:37:13 crc kubenswrapper[4893]: I0314 08:37:13.054362 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-utilities/0.log" Mar 14 08:37:13 crc kubenswrapper[4893]: I0314 08:37:13.065614 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/extract-content/0.log" Mar 14 08:37:13 crc kubenswrapper[4893]: I0314 08:37:13.941560 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xpqn5_57b36d58-d179-46c0-a64b-db5bb0ccc8be/registry-server/0.log" Mar 14 08:37:15 crc kubenswrapper[4893]: I0314 08:37:15.377823 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:37:15 crc kubenswrapper[4893]: E0314 08:37:15.378609 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:37:30 crc kubenswrapper[4893]: I0314 08:37:30.376450 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:37:30 crc kubenswrapper[4893]: E0314 08:37:30.377430 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" 
podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:37:42 crc kubenswrapper[4893]: I0314 08:37:42.376567 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:37:42 crc kubenswrapper[4893]: E0314 08:37:42.377483 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:37:56 crc kubenswrapper[4893]: I0314 08:37:56.376348 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:37:56 crc kubenswrapper[4893]: E0314 08:37:56.378503 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.141319 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557958-cll5w"] Mar 14 08:38:00 crc kubenswrapper[4893]: E0314 08:38:00.141915 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519ad268-04fb-45a8-adeb-261c1f9e8f8d" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.141928 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="519ad268-04fb-45a8-adeb-261c1f9e8f8d" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.142069 4893 
memory_manager.go:354] "RemoveStaleState removing state" podUID="519ad268-04fb-45a8-adeb-261c1f9e8f8d" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.142672 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.145031 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.145666 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.145882 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.153770 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-cll5w"] Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.195031 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tbb\" (UniqueName: \"kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb\") pod \"auto-csr-approver-29557958-cll5w\" (UID: \"b2973bfb-fbeb-4610-bf77-f1c339ea56a7\") " pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.295440 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tbb\" (UniqueName: \"kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb\") pod \"auto-csr-approver-29557958-cll5w\" (UID: \"b2973bfb-fbeb-4610-bf77-f1c339ea56a7\") " pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.325240 4893 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-l2tbb\" (UniqueName: \"kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb\") pod \"auto-csr-approver-29557958-cll5w\" (UID: \"b2973bfb-fbeb-4610-bf77-f1c339ea56a7\") " pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.467870 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.909494 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-cll5w"] Mar 14 08:38:00 crc kubenswrapper[4893]: I0314 08:38:00.919586 4893 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:38:01 crc kubenswrapper[4893]: I0314 08:38:01.249403 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-cll5w" event={"ID":"b2973bfb-fbeb-4610-bf77-f1c339ea56a7","Type":"ContainerStarted","Data":"f72ad675116ec7a1f7e12dc594a84b3fbeab65ab080970ba6535c295642855b6"} Mar 14 08:38:03 crc kubenswrapper[4893]: I0314 08:38:03.265624 4893 generic.go:334] "Generic (PLEG): container finished" podID="b2973bfb-fbeb-4610-bf77-f1c339ea56a7" containerID="22af4973d927048e103915644baaeccdbb22e1a3e2a2d7ad7d238d3bfea15505" exitCode=0 Mar 14 08:38:03 crc kubenswrapper[4893]: I0314 08:38:03.265733 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-cll5w" event={"ID":"b2973bfb-fbeb-4610-bf77-f1c339ea56a7","Type":"ContainerDied","Data":"22af4973d927048e103915644baaeccdbb22e1a3e2a2d7ad7d238d3bfea15505"} Mar 14 08:38:04 crc kubenswrapper[4893]: I0314 08:38:04.532726 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:04 crc kubenswrapper[4893]: I0314 08:38:04.654872 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2tbb\" (UniqueName: \"kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb\") pod \"b2973bfb-fbeb-4610-bf77-f1c339ea56a7\" (UID: \"b2973bfb-fbeb-4610-bf77-f1c339ea56a7\") " Mar 14 08:38:04 crc kubenswrapper[4893]: I0314 08:38:04.664006 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb" (OuterVolumeSpecName: "kube-api-access-l2tbb") pod "b2973bfb-fbeb-4610-bf77-f1c339ea56a7" (UID: "b2973bfb-fbeb-4610-bf77-f1c339ea56a7"). InnerVolumeSpecName "kube-api-access-l2tbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:38:04 crc kubenswrapper[4893]: I0314 08:38:04.757431 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2tbb\" (UniqueName: \"kubernetes.io/projected/b2973bfb-fbeb-4610-bf77-f1c339ea56a7-kube-api-access-l2tbb\") on node \"crc\" DevicePath \"\"" Mar 14 08:38:05 crc kubenswrapper[4893]: I0314 08:38:05.285760 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-cll5w" event={"ID":"b2973bfb-fbeb-4610-bf77-f1c339ea56a7","Type":"ContainerDied","Data":"f72ad675116ec7a1f7e12dc594a84b3fbeab65ab080970ba6535c295642855b6"} Mar 14 08:38:05 crc kubenswrapper[4893]: I0314 08:38:05.285806 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72ad675116ec7a1f7e12dc594a84b3fbeab65ab080970ba6535c295642855b6" Mar 14 08:38:05 crc kubenswrapper[4893]: I0314 08:38:05.285866 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-cll5w" Mar 14 08:38:05 crc kubenswrapper[4893]: I0314 08:38:05.593224 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-8n84s"] Mar 14 08:38:05 crc kubenswrapper[4893]: I0314 08:38:05.598035 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-8n84s"] Mar 14 08:38:07 crc kubenswrapper[4893]: I0314 08:38:07.387374 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087cb6ca-245e-409d-85dc-3e1127fa79c7" path="/var/lib/kubelet/pods/087cb6ca-245e-409d-85dc-3e1127fa79c7/volumes" Mar 14 08:38:10 crc kubenswrapper[4893]: I0314 08:38:10.581764 4893 scope.go:117] "RemoveContainer" containerID="bee0f519d3e5f5f9352b6d23fdbbddc534e45e7862e1761d039e68b32eb9c48c" Mar 14 08:38:11 crc kubenswrapper[4893]: I0314 08:38:11.382139 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:38:11 crc kubenswrapper[4893]: E0314 08:38:11.382941 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:38:23 crc kubenswrapper[4893]: I0314 08:38:23.376546 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:38:23 crc kubenswrapper[4893]: E0314 08:38:23.377482 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:38:27 crc kubenswrapper[4893]: I0314 08:38:27.483574 4893 generic.go:334] "Generic (PLEG): container finished" podID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerID="4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215" exitCode=0 Mar 14 08:38:27 crc kubenswrapper[4893]: I0314 08:38:27.483677 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq4zv/must-gather-rs75c" event={"ID":"cd49778a-3ef6-40f5-ac5f-d84b89b8c786","Type":"ContainerDied","Data":"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215"} Mar 14 08:38:27 crc kubenswrapper[4893]: I0314 08:38:27.484293 4893 scope.go:117] "RemoveContainer" containerID="4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215" Mar 14 08:38:27 crc kubenswrapper[4893]: I0314 08:38:27.912415 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fq4zv_must-gather-rs75c_cd49778a-3ef6-40f5-ac5f-d84b89b8c786/gather/0.log" Mar 14 08:38:35 crc kubenswrapper[4893]: I0314 08:38:35.481663 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fq4zv/must-gather-rs75c"] Mar 14 08:38:35 crc kubenswrapper[4893]: I0314 08:38:35.483611 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fq4zv/must-gather-rs75c" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="copy" containerID="cri-o://bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681" gracePeriod=2 Mar 14 08:38:35 crc kubenswrapper[4893]: I0314 08:38:35.487715 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fq4zv/must-gather-rs75c"] Mar 14 08:38:35 crc kubenswrapper[4893]: I0314 08:38:35.893732 4893 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-must-gather-fq4zv_must-gather-rs75c_cd49778a-3ef6-40f5-ac5f-d84b89b8c786/copy/0.log" Mar 14 08:38:35 crc kubenswrapper[4893]: I0314 08:38:35.894328 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.032387 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ghww\" (UniqueName: \"kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww\") pod \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.032556 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output\") pod \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\" (UID: \"cd49778a-3ef6-40f5-ac5f-d84b89b8c786\") " Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.038549 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww" (OuterVolumeSpecName: "kube-api-access-6ghww") pod "cd49778a-3ef6-40f5-ac5f-d84b89b8c786" (UID: "cd49778a-3ef6-40f5-ac5f-d84b89b8c786"). InnerVolumeSpecName "kube-api-access-6ghww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.125666 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cd49778a-3ef6-40f5-ac5f-d84b89b8c786" (UID: "cd49778a-3ef6-40f5-ac5f-d84b89b8c786"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.134071 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ghww\" (UniqueName: \"kubernetes.io/projected/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-kube-api-access-6ghww\") on node \"crc\" DevicePath \"\"" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.134116 4893 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd49778a-3ef6-40f5-ac5f-d84b89b8c786-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.541006 4893 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fq4zv_must-gather-rs75c_cd49778a-3ef6-40f5-ac5f-d84b89b8c786/copy/0.log" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.541427 4893 generic.go:334] "Generic (PLEG): container finished" podID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerID="bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681" exitCode=143 Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.541485 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq4zv/must-gather-rs75c" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.541542 4893 scope.go:117] "RemoveContainer" containerID="bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.571024 4893 scope.go:117] "RemoveContainer" containerID="4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.642807 4893 scope.go:117] "RemoveContainer" containerID="bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681" Mar 14 08:38:36 crc kubenswrapper[4893]: E0314 08:38:36.643298 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681\": container with ID starting with bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681 not found: ID does not exist" containerID="bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.643329 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681"} err="failed to get container status \"bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681\": rpc error: code = NotFound desc = could not find container \"bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681\": container with ID starting with bc56b360dcf36c666583bbc24e076c6391d77a8066c2aefc3a3d1806c2793681 not found: ID does not exist" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.643355 4893 scope.go:117] "RemoveContainer" containerID="4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215" Mar 14 08:38:36 crc kubenswrapper[4893]: E0314 08:38:36.643658 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215\": container with ID starting with 4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215 not found: ID does not exist" containerID="4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215" Mar 14 08:38:36 crc kubenswrapper[4893]: I0314 08:38:36.643679 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215"} err="failed to get container status \"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215\": rpc error: code = NotFound desc = could not find container \"4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215\": container with ID starting with 4c74824d0e8a8ec320a497350ae8441b7c3998867a9bd43f20ec3a6cf6ec7215 not found: ID does not exist" Mar 14 08:38:37 crc kubenswrapper[4893]: I0314 08:38:37.387581 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" path="/var/lib/kubelet/pods/cd49778a-3ef6-40f5-ac5f-d84b89b8c786/volumes" Mar 14 08:38:38 crc kubenswrapper[4893]: I0314 08:38:38.377497 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:38:38 crc kubenswrapper[4893]: E0314 08:38:38.378105 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:38:51 crc kubenswrapper[4893]: I0314 08:38:51.381678 4893 scope.go:117] "RemoveContainer" 
containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:38:51 crc kubenswrapper[4893]: E0314 08:38:51.382411 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:39:05 crc kubenswrapper[4893]: I0314 08:39:05.377059 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:39:05 crc kubenswrapper[4893]: E0314 08:39:05.378070 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.376452 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:39:19 crc kubenswrapper[4893]: E0314 08:39:19.377201 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.431893 4893 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:19 crc kubenswrapper[4893]: E0314 08:39:19.432195 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="gather" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432209 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="gather" Mar 14 08:39:19 crc kubenswrapper[4893]: E0314 08:39:19.432223 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2973bfb-fbeb-4610-bf77-f1c339ea56a7" containerName="oc" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432229 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2973bfb-fbeb-4610-bf77-f1c339ea56a7" containerName="oc" Mar 14 08:39:19 crc kubenswrapper[4893]: E0314 08:39:19.432241 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="copy" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432250 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="copy" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432405 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2973bfb-fbeb-4610-bf77-f1c339ea56a7" containerName="oc" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432422 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="gather" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.432446 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd49778a-3ef6-40f5-ac5f-d84b89b8c786" containerName="copy" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.433418 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.446449 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.565477 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.565607 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.565793 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbl2z\" (UniqueName: \"kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.628977 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.630624 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.638814 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.668905 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbl2z\" (UniqueName: \"kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.669210 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.669739 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.671100 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.672848 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.688004 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbl2z\" (UniqueName: \"kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z\") pod \"redhat-operators-k8j45\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.754298 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.771169 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.771233 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.771255 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6jz\" (UniqueName: \"kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " 
pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.872978 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.873626 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.873658 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6jz\" (UniqueName: \"kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.873718 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.873946 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" 
Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.894455 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6jz\" (UniqueName: \"kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz\") pod \"redhat-marketplace-lkm5p\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:19 crc kubenswrapper[4893]: I0314 08:39:19.950688 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.172727 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.213081 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:20 crc kubenswrapper[4893]: W0314 08:39:20.219661 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e5ddf_3e48_48d7_9379_84226ced5d50.slice/crio-b83c37f0ff9a4eb5abc5266409dfe08909be83f2002cfca306880d6d90cc96b8 WatchSource:0}: Error finding container b83c37f0ff9a4eb5abc5266409dfe08909be83f2002cfca306880d6d90cc96b8: Status 404 returned error can't find the container with id b83c37f0ff9a4eb5abc5266409dfe08909be83f2002cfca306880d6d90cc96b8 Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.884212 4893 generic.go:334] "Generic (PLEG): container finished" podID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerID="f461e73b8e7841c55d8a1732d8faf3040bee5e9995dd9a406f3577de90675020" exitCode=0 Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.884280 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" 
event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerDied","Data":"f461e73b8e7841c55d8a1732d8faf3040bee5e9995dd9a406f3577de90675020"} Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.884593 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerStarted","Data":"4aaa96e309c6e0bd3fd59c2ffd99018b12f6db85568bb034a31dce9235bc6ab2"} Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.887361 4893 generic.go:334] "Generic (PLEG): container finished" podID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerID="10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b" exitCode=0 Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.887394 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerDied","Data":"10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b"} Mar 14 08:39:20 crc kubenswrapper[4893]: I0314 08:39:20.887415 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerStarted","Data":"b83c37f0ff9a4eb5abc5266409dfe08909be83f2002cfca306880d6d90cc96b8"} Mar 14 08:39:22 crc kubenswrapper[4893]: I0314 08:39:22.902675 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerStarted","Data":"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d"} Mar 14 08:39:22 crc kubenswrapper[4893]: I0314 08:39:22.904568 4893 generic.go:334] "Generic (PLEG): container finished" podID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerID="e1ed4dfb14ec0bdb89bb2fa8d900599ac497166c280b4a9dd2ff8d5e50ad388d" exitCode=0 Mar 14 08:39:22 crc kubenswrapper[4893]: I0314 
08:39:22.904602 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerDied","Data":"e1ed4dfb14ec0bdb89bb2fa8d900599ac497166c280b4a9dd2ff8d5e50ad388d"} Mar 14 08:39:23 crc kubenswrapper[4893]: I0314 08:39:23.914837 4893 generic.go:334] "Generic (PLEG): container finished" podID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerID="06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d" exitCode=0 Mar 14 08:39:23 crc kubenswrapper[4893]: I0314 08:39:23.915854 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerDied","Data":"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d"} Mar 14 08:39:23 crc kubenswrapper[4893]: I0314 08:39:23.920076 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerStarted","Data":"973f7c83c11056c69a481a66e20a268c2a5ed4fcd8ae0bddb784bf7f826bfee5"} Mar 14 08:39:24 crc kubenswrapper[4893]: I0314 08:39:24.929209 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerStarted","Data":"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0"} Mar 14 08:39:24 crc kubenswrapper[4893]: I0314 08:39:24.950673 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k8j45" podStartSLOduration=2.53267378 podStartE2EDuration="5.950653876s" podCreationTimestamp="2026-03-14 08:39:19 +0000 UTC" firstStartedPulling="2026-03-14 08:39:20.889922925 +0000 UTC m=+6040.152099717" lastFinishedPulling="2026-03-14 08:39:24.307903021 +0000 UTC m=+6043.570079813" observedRunningTime="2026-03-14 
08:39:24.948013292 +0000 UTC m=+6044.210190084" watchObservedRunningTime="2026-03-14 08:39:24.950653876 +0000 UTC m=+6044.212830668" Mar 14 08:39:24 crc kubenswrapper[4893]: I0314 08:39:24.953657 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lkm5p" podStartSLOduration=3.524004309 podStartE2EDuration="5.953647289s" podCreationTimestamp="2026-03-14 08:39:19 +0000 UTC" firstStartedPulling="2026-03-14 08:39:20.885798424 +0000 UTC m=+6040.147975216" lastFinishedPulling="2026-03-14 08:39:23.315441404 +0000 UTC m=+6042.577618196" observedRunningTime="2026-03-14 08:39:23.961781567 +0000 UTC m=+6043.223958359" watchObservedRunningTime="2026-03-14 08:39:24.953647289 +0000 UTC m=+6044.215824081" Mar 14 08:39:29 crc kubenswrapper[4893]: I0314 08:39:29.754679 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:29 crc kubenswrapper[4893]: I0314 08:39:29.755271 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:29 crc kubenswrapper[4893]: I0314 08:39:29.951897 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:29 crc kubenswrapper[4893]: I0314 08:39:29.952143 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:29 crc kubenswrapper[4893]: I0314 08:39:29.988160 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:30 crc kubenswrapper[4893]: I0314 08:39:30.794401 4893 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k8j45" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="registry-server" probeResult="failure" output=< Mar 14 
08:39:30 crc kubenswrapper[4893]: timeout: failed to connect service ":50051" within 1s Mar 14 08:39:30 crc kubenswrapper[4893]: > Mar 14 08:39:31 crc kubenswrapper[4893]: I0314 08:39:31.015647 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:31 crc kubenswrapper[4893]: I0314 08:39:31.064693 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:32 crc kubenswrapper[4893]: I0314 08:39:32.978216 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lkm5p" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="registry-server" containerID="cri-o://973f7c83c11056c69a481a66e20a268c2a5ed4fcd8ae0bddb784bf7f826bfee5" gracePeriod=2 Mar 14 08:39:34 crc kubenswrapper[4893]: I0314 08:39:34.377429 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:39:34 crc kubenswrapper[4893]: E0314 08:39:34.378146 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.004916 4893 generic.go:334] "Generic (PLEG): container finished" podID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerID="973f7c83c11056c69a481a66e20a268c2a5ed4fcd8ae0bddb784bf7f826bfee5" exitCode=0 Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.005026 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" 
event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerDied","Data":"973f7c83c11056c69a481a66e20a268c2a5ed4fcd8ae0bddb784bf7f826bfee5"} Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.312235 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.404380 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities\") pod \"da7f2d15-a152-482a-9ddc-dbc95caee46f\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.404451 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content\") pod \"da7f2d15-a152-482a-9ddc-dbc95caee46f\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.404542 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6jz\" (UniqueName: \"kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz\") pod \"da7f2d15-a152-482a-9ddc-dbc95caee46f\" (UID: \"da7f2d15-a152-482a-9ddc-dbc95caee46f\") " Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.406030 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities" (OuterVolumeSpecName: "utilities") pod "da7f2d15-a152-482a-9ddc-dbc95caee46f" (UID: "da7f2d15-a152-482a-9ddc-dbc95caee46f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.413878 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz" (OuterVolumeSpecName: "kube-api-access-sm6jz") pod "da7f2d15-a152-482a-9ddc-dbc95caee46f" (UID: "da7f2d15-a152-482a-9ddc-dbc95caee46f"). InnerVolumeSpecName "kube-api-access-sm6jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.430067 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da7f2d15-a152-482a-9ddc-dbc95caee46f" (UID: "da7f2d15-a152-482a-9ddc-dbc95caee46f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.506413 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.506856 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da7f2d15-a152-482a-9ddc-dbc95caee46f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:36 crc kubenswrapper[4893]: I0314 08:39:36.506870 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6jz\" (UniqueName: \"kubernetes.io/projected/da7f2d15-a152-482a-9ddc-dbc95caee46f-kube-api-access-sm6jz\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.015787 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkm5p" 
event={"ID":"da7f2d15-a152-482a-9ddc-dbc95caee46f","Type":"ContainerDied","Data":"4aaa96e309c6e0bd3fd59c2ffd99018b12f6db85568bb034a31dce9235bc6ab2"} Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.015862 4893 scope.go:117] "RemoveContainer" containerID="973f7c83c11056c69a481a66e20a268c2a5ed4fcd8ae0bddb784bf7f826bfee5" Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.016750 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkm5p" Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.037846 4893 scope.go:117] "RemoveContainer" containerID="e1ed4dfb14ec0bdb89bb2fa8d900599ac497166c280b4a9dd2ff8d5e50ad388d" Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.056692 4893 scope.go:117] "RemoveContainer" containerID="f461e73b8e7841c55d8a1732d8faf3040bee5e9995dd9a406f3577de90675020" Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.072584 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.080869 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkm5p"] Mar 14 08:39:37 crc kubenswrapper[4893]: I0314 08:39:37.384665 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" path="/var/lib/kubelet/pods/da7f2d15-a152-482a-9ddc-dbc95caee46f/volumes" Mar 14 08:39:39 crc kubenswrapper[4893]: I0314 08:39:39.795935 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:39 crc kubenswrapper[4893]: I0314 08:39:39.841390 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:40 crc kubenswrapper[4893]: I0314 08:39:40.713954 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.047165 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k8j45" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="registry-server" containerID="cri-o://ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0" gracePeriod=2 Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.505185 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.577057 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbl2z\" (UniqueName: \"kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z\") pod \"458e5ddf-3e48-48d7-9379-84226ced5d50\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.577183 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities\") pod \"458e5ddf-3e48-48d7-9379-84226ced5d50\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.577242 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content\") pod \"458e5ddf-3e48-48d7-9379-84226ced5d50\" (UID: \"458e5ddf-3e48-48d7-9379-84226ced5d50\") " Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.579861 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities" (OuterVolumeSpecName: "utilities") pod "458e5ddf-3e48-48d7-9379-84226ced5d50" (UID: 
"458e5ddf-3e48-48d7-9379-84226ced5d50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.582441 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z" (OuterVolumeSpecName: "kube-api-access-dbl2z") pod "458e5ddf-3e48-48d7-9379-84226ced5d50" (UID: "458e5ddf-3e48-48d7-9379-84226ced5d50"). InnerVolumeSpecName "kube-api-access-dbl2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.679203 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.679268 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbl2z\" (UniqueName: \"kubernetes.io/projected/458e5ddf-3e48-48d7-9379-84226ced5d50-kube-api-access-dbl2z\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.713011 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "458e5ddf-3e48-48d7-9379-84226ced5d50" (UID: "458e5ddf-3e48-48d7-9379-84226ced5d50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:41 crc kubenswrapper[4893]: I0314 08:39:41.780758 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458e5ddf-3e48-48d7-9379-84226ced5d50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.059090 4893 generic.go:334] "Generic (PLEG): container finished" podID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerID="ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0" exitCode=0 Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.059129 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerDied","Data":"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0"} Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.059154 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8j45" event={"ID":"458e5ddf-3e48-48d7-9379-84226ced5d50","Type":"ContainerDied","Data":"b83c37f0ff9a4eb5abc5266409dfe08909be83f2002cfca306880d6d90cc96b8"} Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.059157 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k8j45" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.059178 4893 scope.go:117] "RemoveContainer" containerID="ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.097917 4893 scope.go:117] "RemoveContainer" containerID="06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.100641 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.106285 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k8j45"] Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.135987 4893 scope.go:117] "RemoveContainer" containerID="10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.159878 4893 scope.go:117] "RemoveContainer" containerID="ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0" Mar 14 08:39:42 crc kubenswrapper[4893]: E0314 08:39:42.160839 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0\": container with ID starting with ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0 not found: ID does not exist" containerID="ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.160915 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0"} err="failed to get container status \"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0\": rpc error: code = NotFound desc = could not find container 
\"ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0\": container with ID starting with ce3789019fb61efb32649a7e0ab492327c8eec80c370a616a45de86d291bffa0 not found: ID does not exist" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.160972 4893 scope.go:117] "RemoveContainer" containerID="06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d" Mar 14 08:39:42 crc kubenswrapper[4893]: E0314 08:39:42.161645 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d\": container with ID starting with 06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d not found: ID does not exist" containerID="06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.161676 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d"} err="failed to get container status \"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d\": rpc error: code = NotFound desc = could not find container \"06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d\": container with ID starting with 06ecc6582883736b0e8c56b9a187440a7b5aa8c6b61b8d66f87e63e4a170e86d not found: ID does not exist" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.161694 4893 scope.go:117] "RemoveContainer" containerID="10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b" Mar 14 08:39:42 crc kubenswrapper[4893]: E0314 08:39:42.162126 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b\": container with ID starting with 10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b not found: ID does not exist" 
containerID="10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b" Mar 14 08:39:42 crc kubenswrapper[4893]: I0314 08:39:42.162164 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b"} err="failed to get container status \"10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b\": rpc error: code = NotFound desc = could not find container \"10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b\": container with ID starting with 10995fe542c45997ad57f47e167c19771ca253d0a44ac7a3cd7e3798850b536b not found: ID does not exist" Mar 14 08:39:43 crc kubenswrapper[4893]: I0314 08:39:43.390089 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" path="/var/lib/kubelet/pods/458e5ddf-3e48-48d7-9379-84226ced5d50/volumes" Mar 14 08:39:45 crc kubenswrapper[4893]: I0314 08:39:45.376866 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:39:45 crc kubenswrapper[4893]: E0314 08:39:45.377334 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.181321 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557960-r8rqh"] Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183333 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="extract-utilities" Mar 14 08:40:00 crc 
kubenswrapper[4893]: I0314 08:40:00.183367 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="extract-utilities" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183427 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="extract-utilities" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.183445 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="extract-utilities" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183498 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.183515 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183571 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="extract-content" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.183589 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="extract-content" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183657 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.183762 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.183848 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="extract-content" Mar 14 08:40:00 crc 
kubenswrapper[4893]: I0314 08:40:00.183868 4893 state_mem.go:107] "Deleted CPUSet assignment" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="extract-content" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.186698 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7f2d15-a152-482a-9ddc-dbc95caee46f" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.186773 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="458e5ddf-3e48-48d7-9379-84226ced5d50" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.188836 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.194220 4893 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-44qb7" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.194422 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.194447 4893 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.195813 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-r8rqh"] Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.357029 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkx6\" (UniqueName: \"kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6\") pod \"auto-csr-approver-29557960-r8rqh\" (UID: \"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93\") " pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.376361 4893 scope.go:117] 
"RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:40:00 crc kubenswrapper[4893]: E0314 08:40:00.376684 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.458632 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkx6\" (UniqueName: \"kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6\") pod \"auto-csr-approver-29557960-r8rqh\" (UID: \"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93\") " pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.488261 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkx6\" (UniqueName: \"kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6\") pod \"auto-csr-approver-29557960-r8rqh\" (UID: \"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93\") " pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.516065 4893 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:00 crc kubenswrapper[4893]: I0314 08:40:00.990386 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-r8rqh"] Mar 14 08:40:01 crc kubenswrapper[4893]: I0314 08:40:01.216017 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" event={"ID":"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93","Type":"ContainerStarted","Data":"fa6e065c70a4288588098cb9d9bf11145311231c1f31e0a8c6cfc0b484d742aa"} Mar 14 08:40:03 crc kubenswrapper[4893]: I0314 08:40:03.236219 4893 generic.go:334] "Generic (PLEG): container finished" podID="934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93" containerID="d9d54fc28d2b62b9d97ea2ce72c8e112edc50298c326dee7f5ed909804e57a08" exitCode=0 Mar 14 08:40:03 crc kubenswrapper[4893]: I0314 08:40:03.236292 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" event={"ID":"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93","Type":"ContainerDied","Data":"d9d54fc28d2b62b9d97ea2ce72c8e112edc50298c326dee7f5ed909804e57a08"} Mar 14 08:40:04 crc kubenswrapper[4893]: I0314 08:40:04.539913 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:04 crc kubenswrapper[4893]: I0314 08:40:04.642594 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljkx6\" (UniqueName: \"kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6\") pod \"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93\" (UID: \"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93\") " Mar 14 08:40:04 crc kubenswrapper[4893]: I0314 08:40:04.648122 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6" (OuterVolumeSpecName: "kube-api-access-ljkx6") pod "934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93" (UID: "934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93"). InnerVolumeSpecName "kube-api-access-ljkx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:04 crc kubenswrapper[4893]: I0314 08:40:04.744612 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljkx6\" (UniqueName: \"kubernetes.io/projected/934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93-kube-api-access-ljkx6\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:05 crc kubenswrapper[4893]: I0314 08:40:05.256956 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" event={"ID":"934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93","Type":"ContainerDied","Data":"fa6e065c70a4288588098cb9d9bf11145311231c1f31e0a8c6cfc0b484d742aa"} Mar 14 08:40:05 crc kubenswrapper[4893]: I0314 08:40:05.257000 4893 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6e065c70a4288588098cb9d9bf11145311231c1f31e0a8c6cfc0b484d742aa" Mar 14 08:40:05 crc kubenswrapper[4893]: I0314 08:40:05.257121 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-r8rqh" Mar 14 08:40:05 crc kubenswrapper[4893]: I0314 08:40:05.626119 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-kkc98"] Mar 14 08:40:05 crc kubenswrapper[4893]: I0314 08:40:05.632409 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-kkc98"] Mar 14 08:40:07 crc kubenswrapper[4893]: I0314 08:40:07.391937 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2613890a-4617-42c9-80db-d0d99e6ba0ab" path="/var/lib/kubelet/pods/2613890a-4617-42c9-80db-d0d99e6ba0ab/volumes" Mar 14 08:40:10 crc kubenswrapper[4893]: I0314 08:40:10.666597 4893 scope.go:117] "RemoveContainer" containerID="d94b8b0a858ad536d84b96e2598879f33a149549252e75056a5b08c7c5e35c03" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.376871 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:40:13 crc kubenswrapper[4893]: E0314 08:40:13.377646 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.627238 4893 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:13 crc kubenswrapper[4893]: E0314 08:40:13.627517 4893 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93" containerName="oc" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.627543 4893 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93" containerName="oc" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.627696 4893 memory_manager.go:354] "RemoveStaleState removing state" podUID="934bf78d-e1d0-4e8c-8b5c-ac4e685bcb93" containerName="oc" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.628617 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.643662 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.789359 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.789607 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n478q\" (UniqueName: \"kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.789681 4893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.891313 4893 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.891429 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n478q\" (UniqueName: \"kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.891458 4893 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.891879 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.892005 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:13 crc kubenswrapper[4893]: I0314 08:40:13.910096 4893 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n478q\" (UniqueName: \"kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q\") pod \"certified-operators-wv7gg\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:14 crc kubenswrapper[4893]: I0314 08:40:14.001128 4893 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:14 crc kubenswrapper[4893]: I0314 08:40:14.290720 4893 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:14 crc kubenswrapper[4893]: W0314 08:40:14.302318 4893 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb4f477_d791_4f49_aa0f_5d7cc35c08ea.slice/crio-2d5ab1357127ca9ff15a40eea8ae7813559ae48ee184dde1655b1d27babbe90f WatchSource:0}: Error finding container 2d5ab1357127ca9ff15a40eea8ae7813559ae48ee184dde1655b1d27babbe90f: Status 404 returned error can't find the container with id 2d5ab1357127ca9ff15a40eea8ae7813559ae48ee184dde1655b1d27babbe90f Mar 14 08:40:14 crc kubenswrapper[4893]: I0314 08:40:14.345478 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerStarted","Data":"2d5ab1357127ca9ff15a40eea8ae7813559ae48ee184dde1655b1d27babbe90f"} Mar 14 08:40:15 crc kubenswrapper[4893]: I0314 08:40:15.357033 4893 generic.go:334] "Generic (PLEG): container finished" podID="7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" containerID="260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd" exitCode=0 Mar 14 08:40:15 crc kubenswrapper[4893]: I0314 08:40:15.357084 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" 
event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerDied","Data":"260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd"} Mar 14 08:40:16 crc kubenswrapper[4893]: I0314 08:40:16.368703 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerStarted","Data":"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558"} Mar 14 08:40:17 crc kubenswrapper[4893]: I0314 08:40:17.380644 4893 generic.go:334] "Generic (PLEG): container finished" podID="7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" containerID="ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558" exitCode=0 Mar 14 08:40:17 crc kubenswrapper[4893]: I0314 08:40:17.387112 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerDied","Data":"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558"} Mar 14 08:40:19 crc kubenswrapper[4893]: I0314 08:40:19.393681 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerStarted","Data":"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08"} Mar 14 08:40:19 crc kubenswrapper[4893]: I0314 08:40:19.418247 4893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wv7gg" podStartSLOduration=3.6580338340000003 podStartE2EDuration="6.418224568s" podCreationTimestamp="2026-03-14 08:40:13 +0000 UTC" firstStartedPulling="2026-03-14 08:40:15.358960023 +0000 UTC m=+6094.621136835" lastFinishedPulling="2026-03-14 08:40:18.119150777 +0000 UTC m=+6097.381327569" observedRunningTime="2026-03-14 08:40:19.414122958 +0000 UTC m=+6098.676299780" watchObservedRunningTime="2026-03-14 08:40:19.418224568 +0000 UTC 
m=+6098.680401370" Mar 14 08:40:24 crc kubenswrapper[4893]: I0314 08:40:24.001259 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:24 crc kubenswrapper[4893]: I0314 08:40:24.002757 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:24 crc kubenswrapper[4893]: I0314 08:40:24.038313 4893 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:24 crc kubenswrapper[4893]: I0314 08:40:24.458686 4893 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:24 crc kubenswrapper[4893]: I0314 08:40:24.511170 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:25 crc kubenswrapper[4893]: I0314 08:40:25.377220 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:40:25 crc kubenswrapper[4893]: E0314 08:40:25.377643 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:40:26 crc kubenswrapper[4893]: I0314 08:40:26.436725 4893 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wv7gg" podUID="7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" containerName="registry-server" containerID="cri-o://2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08" gracePeriod=2 Mar 
14 08:40:26 crc kubenswrapper[4893]: I0314 08:40:26.852916 4893 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.001500 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n478q\" (UniqueName: \"kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q\") pod \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.001861 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities\") pod \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.001945 4893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content\") pod \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\" (UID: \"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea\") " Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.003011 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities" (OuterVolumeSpecName: "utilities") pod "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" (UID: "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.007155 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q" (OuterVolumeSpecName: "kube-api-access-n478q") pod "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" (UID: "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea"). InnerVolumeSpecName "kube-api-access-n478q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.103104 4893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n478q\" (UniqueName: \"kubernetes.io/projected/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-kube-api-access-n478q\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.103138 4893 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.446994 4893 generic.go:334] "Generic (PLEG): container finished" podID="7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" containerID="2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08" exitCode=0 Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.447061 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerDied","Data":"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08"} Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.447088 4893 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wv7gg" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.447142 4893 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7gg" event={"ID":"7cb4f477-d791-4f49-aa0f-5d7cc35c08ea","Type":"ContainerDied","Data":"2d5ab1357127ca9ff15a40eea8ae7813559ae48ee184dde1655b1d27babbe90f"} Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.447170 4893 scope.go:117] "RemoveContainer" containerID="2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.467081 4893 scope.go:117] "RemoveContainer" containerID="ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.483762 4893 scope.go:117] "RemoveContainer" containerID="260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.504720 4893 scope.go:117] "RemoveContainer" containerID="2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08" Mar 14 08:40:27 crc kubenswrapper[4893]: E0314 08:40:27.505094 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08\": container with ID starting with 2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08 not found: ID does not exist" containerID="2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.505183 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08"} err="failed to get container status \"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08\": rpc error: code = NotFound desc = could not find container 
\"2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08\": container with ID starting with 2f5cefcf6b74371f1223a6f5833ebb3a72c88ce8d2c6e23d18a97207ea40bc08 not found: ID does not exist" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.505215 4893 scope.go:117] "RemoveContainer" containerID="ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558" Mar 14 08:40:27 crc kubenswrapper[4893]: E0314 08:40:27.505511 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558\": container with ID starting with ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558 not found: ID does not exist" containerID="ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.505572 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558"} err="failed to get container status \"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558\": rpc error: code = NotFound desc = could not find container \"ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558\": container with ID starting with ead7c8ff8875dfdf6c5a76dc6bd49f8ecaa9968c7c87e6738fabef4fd5c60558 not found: ID does not exist" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.505590 4893 scope.go:117] "RemoveContainer" containerID="260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd" Mar 14 08:40:27 crc kubenswrapper[4893]: E0314 08:40:27.506010 4893 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd\": container with ID starting with 260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd not found: ID does not exist" 
containerID="260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.506043 4893 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd"} err="failed to get container status \"260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd\": rpc error: code = NotFound desc = could not find container \"260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd\": container with ID starting with 260511baffe033c2aa4c01cb2c86c9632a3aa23cb1dc71727df13a712debeedd not found: ID does not exist" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.614073 4893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" (UID: "7cb4f477-d791-4f49-aa0f-5d7cc35c08ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.711683 4893 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.783587 4893 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:27 crc kubenswrapper[4893]: I0314 08:40:27.790291 4893 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wv7gg"] Mar 14 08:40:29 crc kubenswrapper[4893]: I0314 08:40:29.384754 4893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb4f477-d791-4f49-aa0f-5d7cc35c08ea" path="/var/lib/kubelet/pods/7cb4f477-d791-4f49-aa0f-5d7cc35c08ea/volumes" Mar 14 08:40:40 crc kubenswrapper[4893]: I0314 08:40:40.376378 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:40:40 crc kubenswrapper[4893]: E0314 08:40:40.377434 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d" Mar 14 08:40:53 crc kubenswrapper[4893]: I0314 08:40:53.376768 4893 scope.go:117] "RemoveContainer" containerID="19d1c71c015278c3b832cc9c2d45649b6580e0fc82b6df298faf32bcd6ab8569" Mar 14 08:40:53 crc kubenswrapper[4893]: E0314 08:40:53.377844 4893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d4x6q_openshift-machine-config-operator(ad6724e5-48cf-4417-ae51-b1cb8c6af70d)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4x6q" podUID="ad6724e5-48cf-4417-ae51-b1cb8c6af70d"